Apr 21 15:59:12.809782 ip-10-0-138-191 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 15:59:12.809796 ip-10-0-138-191 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 15:59:12.809807 ip-10-0-138-191 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 15:59:12.810022 ip-10-0-138-191 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 15:59:23.037191 ip-10-0-138-191 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 15:59:23.037212 ip-10-0-138-191 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 57ad9497614b4ee0875960e5dd93e822 --
Apr 21 16:01:51.501098 ip-10-0-138-191 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 16:01:51.978774 ip-10-0-138-191 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 16:01:51.978774 ip-10-0-138-191 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 16:01:51.978774 ip-10-0-138-191 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 16:01:51.978774 ip-10-0-138-191 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 16:01:51.978774 ip-10-0-138-191 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
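The deprecation warnings above say these flags should move into the kubelet config file passed via --config (shown later in this log as /etc/kubernetes/kubelet.conf). As a rough, illustrative sketch only, assuming a standard KubeletConfiguration file and reusing the values visible in the FLAG dump further down, the three "should be set via the config file" flags would map onto config fields roughly like this; it is not the actual file from this node:

    # Hypothetical sketch of a KubeletConfiguration covering the flags that the kubelet
    # says should be set via the --config file. Field names are from the
    # kubelet.config.k8s.io/v1beta1 API; values are copied from the FLAG dump later in
    # this log, not read from the node's real /etc/kubernetes/kubelet.conf.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: "/var/run/crio/crio.sock"            # replaces --container-runtime-endpoint
    volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir
    systemReserved:                                                # replaces --system-reserved
      cpu: "500m"
      memory: "1Gi"
      ephemeral-storage: "1Gi"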
Apr 21 16:01:51.981164 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.981094 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 21 16:01:51.983351 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983337 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 16:01:51.983351 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983351 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983354 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983359 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983361 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983365 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983367 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983370 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983373 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983376 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983379 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983381 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983384 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983387 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983389 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983392 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983394 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983397 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983401 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983405 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983408 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 16:01:51.983413 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983410 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983413 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983416 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983419 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983422 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983424 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983434 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983437 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983440 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983442 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983444 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983447 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983450 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983452 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983455 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983457 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983460 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983463 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983465 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 16:01:51.983900 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983468 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 
16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983470 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983472 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983475 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983477 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983480 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983483 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983485 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983488 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983491 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983494 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983497 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983500 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983502 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983504 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983508 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983510 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983513 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983515 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983518 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 16:01:51.984358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983521 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983523 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983526 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 16:01:51.984890 ip-10-0-138-191 
kubenswrapper[2576]: W0421 16:01:51.983528 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983531 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983533 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983536 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983538 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983541 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983544 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983546 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983549 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983551 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983555 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983559 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983562 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983567 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983571 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983574 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 16:01:51.984890 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983576 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983579 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983582 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983585 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983588 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983590 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 16:01:51.985344 ip-10-0-138-191 
kubenswrapper[2576]: W0421 16:01:51.983593 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983955 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983960 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983963 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983966 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983969 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983972 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983975 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983977 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983980 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983983 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983986 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983989 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 16:01:51.985344 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983991 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983994 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983996 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.983999 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984002 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984005 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984007 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984010 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984012 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 
16:01:51.984015 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984018 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984020 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984023 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984026 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984028 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984031 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984033 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984036 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984038 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 16:01:51.985818 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984041 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984044 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984047 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984052 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984054 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984057 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984060 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984062 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984065 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984067 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984070 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984073 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984075 2576 feature_gate.go:328] unrecognized 
feature gate: NewOLM Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984078 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984080 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984083 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984085 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984088 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984090 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984093 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 16:01:51.986305 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984095 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984098 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984101 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984105 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984109 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984112 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984115 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984118 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984120 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984124 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984127 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984130 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984133 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984136 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984138 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984142 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984145 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984147 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984150 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 16:01:51.986805 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984152 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984155 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984157 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984160 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984162 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984165 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984167 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984170 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984172 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984177 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984179 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984182 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984184 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984187 2576 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImages Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984190 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984192 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984259 2576 flags.go:64] FLAG: --address="0.0.0.0" Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984273 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984281 2576 flags.go:64] FLAG: --anonymous-auth="true" Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984285 2576 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984289 2576 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 21 16:01:51.987300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984292 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984296 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984301 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984304 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984308 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984311 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984315 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984318 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984322 2576 flags.go:64] FLAG: --cgroup-root="" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984324 2576 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984327 2576 flags.go:64] FLAG: --client-ca-file="" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984330 2576 flags.go:64] FLAG: --cloud-config="" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984333 2576 flags.go:64] FLAG: --cloud-provider="external" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984336 2576 flags.go:64] FLAG: --cluster-dns="[]" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984339 2576 flags.go:64] FLAG: --cluster-domain="" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984342 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984346 2576 flags.go:64] FLAG: --config-dir="" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984348 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 21 
16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984352 2576 flags.go:64] FLAG: --container-log-max-files="5" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984356 2576 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984363 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984366 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984370 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984373 2576 flags.go:64] FLAG: --contention-profiling="false" Apr 21 16:01:51.987818 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984376 2576 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984379 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984382 2576 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984384 2576 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984389 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984392 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984395 2576 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984398 2576 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984401 2576 flags.go:64] FLAG: --enable-server="true" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984404 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984412 2576 flags.go:64] FLAG: --event-burst="100" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984415 2576 flags.go:64] FLAG: --event-qps="50" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984418 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984421 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984424 2576 flags.go:64] FLAG: --eviction-hard="" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984429 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984432 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984435 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984438 2576 flags.go:64] FLAG: --eviction-soft="" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984441 2576 flags.go:64] FLAG: 
--eviction-soft-grace-period="" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984443 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984446 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984449 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984452 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984455 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 16:01:51.988425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984458 2576 flags.go:64] FLAG: --feature-gates="" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984461 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984464 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984468 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984471 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984474 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984477 2576 flags.go:64] FLAG: --help="false" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984480 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-138-191.ec2.internal" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984483 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984486 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984489 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984493 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984496 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984500 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984502 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984505 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984508 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984511 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984514 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984517 
2576 flags.go:64] FLAG: --kube-reserved="" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984520 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984523 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984526 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984529 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 16:01:51.989073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984532 2576 flags.go:64] FLAG: --lock-file="" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984535 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984538 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984541 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984546 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984549 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984552 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984554 2576 flags.go:64] FLAG: --logging-format="text" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984557 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984560 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984563 2576 flags.go:64] FLAG: --manifest-url="" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984566 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984570 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984573 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984577 2576 flags.go:64] FLAG: --max-pods="110" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984580 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984583 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984586 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984589 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984592 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984595 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984597 2576 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984605 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984608 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984611 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 16:01:51.989659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984614 2576 flags.go:64] FLAG: --pod-cidr="" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984617 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984622 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984625 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984628 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984631 2576 flags.go:64] FLAG: --port="10250" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984634 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984637 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bcd53649a298bbbd" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984640 2576 flags.go:64] FLAG: --qos-reserved="" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984643 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984646 2576 flags.go:64] FLAG: --register-node="true" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984649 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984652 2576 flags.go:64] FLAG: --register-with-taints="" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984656 2576 flags.go:64] FLAG: --registry-burst="10" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984659 2576 flags.go:64] FLAG: --registry-qps="5" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984662 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984665 2576 flags.go:64] FLAG: --reserved-memory="" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984668 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984671 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984677 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984680 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984682 2576 flags.go:64] FLAG: --runonce="false" Apr 21 16:01:51.990316 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:01:51.984685 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984688 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984691 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 21 16:01:51.990316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984694 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984697 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984700 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984702 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984706 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984709 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984712 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984715 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984717 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984720 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984723 2576 flags.go:64] FLAG: --system-cgroups="" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984726 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984731 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984734 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984737 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984741 2576 flags.go:64] FLAG: --tls-min-version="" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984744 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984746 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984749 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984752 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984755 2576 flags.go:64] FLAG: --v="2" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984759 2576 flags.go:64] FLAG: --version="false" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984763 2576 flags.go:64] FLAG: --vmodule="" Apr 21 16:01:51.990933 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:01:51.984767 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.984770 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 16:01:51.990933 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984879 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984884 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984887 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984890 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984893 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984896 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984899 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984902 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984905 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984907 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984910 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984912 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984915 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984918 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984921 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984923 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984926 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984928 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984931 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 16:01:51.991570 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984934 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 16:01:51.992056 ip-10-0-138-191 
kubenswrapper[2576]: W0421 16:01:51.984936 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984940 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984943 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984945 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984950 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984953 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984956 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984959 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984961 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984964 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984967 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984969 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984972 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984977 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984980 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984982 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984985 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984987 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984990 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 16:01:51.992056 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984993 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984995 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.984998 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985000 2576 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985003 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985006 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985008 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985011 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985015 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985018 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985021 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985024 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985026 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985029 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985033 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985035 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985038 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985041 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985043 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985046 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 16:01:51.992568 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985048 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985051 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985053 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985056 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985058 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985061 2576 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985065 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985068 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985070 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985073 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985076 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985078 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985081 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985083 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985086 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985088 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985091 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985093 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985096 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985098 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 16:01:51.993080 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985101 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985104 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985106 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985109 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985112 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985114 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.985118 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.985126 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.991167 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.991182 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991229 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991234 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991237 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991240 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991243 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991246 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 16:01:51.993582 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991249 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991251 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991254 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991256 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991259 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991262 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991264 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991267 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991269 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991272 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991274 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991277 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 16:01:51.994008 ip-10-0-138-191 
kubenswrapper[2576]: W0421 16:01:51.991279 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991282 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991285 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991288 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991290 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991293 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991295 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991297 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 16:01:51.994008 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991300 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991302 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991307 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991311 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991315 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991319 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991323 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991326 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991328 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991331 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991333 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991336 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991338 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991341 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991343 2576 feature_gate.go:328] 
unrecognized feature gate: VolumeGroupSnapshot Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991346 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991348 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991351 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991354 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991356 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 16:01:51.994500 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991359 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991361 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991364 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991366 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991369 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991371 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991374 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991376 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991379 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991381 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991384 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991386 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991389 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991391 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991394 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991397 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991399 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 
21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991402 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991406 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991408 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 16:01:51.995033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991411 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991413 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991416 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991418 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991421 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991423 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991426 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991428 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991431 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991433 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991436 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991438 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991441 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991443 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991447 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
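The same batch of "unrecognized feature gate" warnings is emitted several times during this startup; they appear to be cluster-level (OpenShift) gates that the kubelet's own feature-gate registry does not know, so for kubelet debugging they are mostly noise. A minimal sketch that collapses them into a unique, sorted list, assuming the journal has been saved one record per line to a hypothetical file named kubelet.log:

    # Collapse repeated "unrecognized feature gate" warnings into unique names.
    # Assumption: kubelet.log holds one journal record per line (hypothetical file).
    import re
    from pathlib import Path

    pattern = re.compile(r'unrecognized feature gate: (\S+)')
    gates = set()
    for line in Path("kubelet.log").read_text().splitlines():
        m = pattern.search(line)
        if m:
            gates.add(m.group(1))

    for name in sorted(gates):
        print(name)
    print(len(gates), "unique unrecognized gates")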
Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991450 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991453 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991456 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991459 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 16:01:51.995518 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991461 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.991466 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991560 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991564 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991567 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991569 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991572 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991575 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991578 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991580 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991583 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991586 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991589 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991591 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991594 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991597 2576 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstall Apr 21 16:01:51.996003 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991599 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991602 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991605 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991607 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991610 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991612 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991615 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991617 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991620 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991622 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991624 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991627 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991630 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991632 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991635 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991637 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991640 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991643 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991645 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 16:01:51.996405 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991647 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991651 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991653 2576 feature_gate.go:328] unrecognized feature gate: 
NetworkSegmentation Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991656 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991658 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991661 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991663 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991666 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991669 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991671 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991675 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991677 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991680 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991682 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991685 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991687 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991690 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991692 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991695 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991697 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 16:01:51.996875 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991700 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991702 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991705 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991707 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991710 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 
16:01:51.991712 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991715 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991717 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991720 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991722 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991725 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991727 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991730 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991732 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991735 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991737 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991739 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991742 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991745 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991748 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 16:01:51.997358 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991752 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991756 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
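Alongside the warnings, the kubelet logs the gates it actually applied in the feature_gate.go:384 "feature gates: {map[...]}" records (repeated here with identical content). A sketch, under the same one-record-per-line kubelet.log assumption, that parses the last such map into a dictionary:

    # Parse the kubelet's logged effective feature-gate map into a dict.
    # Assumption: kubelet.log holds one journal record per line (hypothetical file).
    import re
    from pathlib import Path

    gates = {}
    for line in Path("kubelet.log").read_text().splitlines():
        m = re.search(r'feature gates: \{map\[(.*)\]\}', line)
        if m:
            # Later records overwrite earlier ones; in this log they are identical.
            gates = {name: value == "true"
                     for name, value in (pair.split(":", 1) for pair in m.group(1).split())}

    for name, enabled in sorted(gates.items()):
        print(f"{name} = {enabled}")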
Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991759 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991763 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991765 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991768 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991770 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991773 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991775 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991778 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991780 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991783 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:51.991785 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.991790 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.992648 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 16:01:51.997941 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.995453 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 16:01:51.998312 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.996796 2576 server.go:1019] "Starting client certificate rotation" Apr 21 16:01:51.998312 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.996891 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 16:01:51.998312 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:51.996932 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 16:01:52.026918 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.026900 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 16:01:52.036009 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.035993 2576 dynamic_cafile_content.go:161] "Starting 
controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 16:01:52.054785 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.054765 2576 log.go:25] "Validated CRI v1 runtime API" Apr 21 16:01:52.057467 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.057442 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 16:01:52.061220 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.061208 2576 log.go:25] "Validated CRI v1 image API" Apr 21 16:01:52.062403 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.062384 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 16:01:52.065470 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.065453 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9a07efb4-3b1e-44f1-91b4-ed76630a22f2:/dev/nvme0n1p4 c69ac013-bdb8-42ba-a22f-bbafd47eb3ee:/dev/nvme0n1p3] Apr 21 16:01:52.065532 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.065470 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 16:01:52.071091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.070986 2576 manager.go:217] Machine: {Timestamp:2026-04-21 16:01:52.069038171 +0000 UTC m=+0.445219974 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101826 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b70c92fdb5d52969d347b7a5501a9 SystemUUID:ec2b70c9-2fdb-5d52-969d-347b7a5501a9 BootID:57ad9497-614b-4ee0-8759-60e5dd93e822 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8e:5a:98:81:c5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8e:5a:98:81:c5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:e0:94:ab:8f:84 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction 
Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 16:01:52.071091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.071086 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 21 16:01:52.071195 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.071151 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 16:01:52.074577 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.074551 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 16:01:52.074713 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.074580 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-191.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 16:01:52.074759 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.074722 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 16:01:52.074759 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.074730 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 16:01:52.074759 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.074747 2576 manager.go:141] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 16:01:52.076237 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.076226 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 16:01:52.077495 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.077473 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vkm45" Apr 21 16:01:52.077997 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.077987 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 21 16:01:52.078226 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.078216 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 16:01:52.081483 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.081474 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 21 16:01:52.081529 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.081506 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 16:01:52.081529 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.081520 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 16:01:52.081529 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.081530 2576 kubelet.go:397] "Adding apiserver pod source" Apr 21 16:01:52.081644 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.081548 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 16:01:52.082550 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.082537 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 16:01:52.082595 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.082557 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 16:01:52.085001 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.084984 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vkm45" Apr 21 16:01:52.086379 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.086364 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 16:01:52.088110 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.088095 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 16:01:52.090569 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090556 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 16:01:52.090609 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090582 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 16:01:52.090609 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090591 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 16:01:52.090609 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090597 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 16:01:52.090609 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090602 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 16:01:52.090609 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090607 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/secret" Apr 21 16:01:52.090740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090613 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 16:01:52.090740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090618 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 16:01:52.090740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090626 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 16:01:52.090740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090631 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 16:01:52.090740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090642 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 16:01:52.090740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.090654 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 16:01:52.092720 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.092709 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 16:01:52.092756 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.092722 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 16:01:52.094710 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.094688 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:52.100418 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.100402 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 16:01:52.100526 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.100436 2576 server.go:1295] "Started kubelet" Apr 21 16:01:52.100899 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.100847 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 16:01:52.100987 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.100922 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 16:01:52.101141 ip-10-0-138-191 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 16:01:52.101264 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.101238 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 16:01:52.102227 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.102135 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:52.102654 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.102636 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 16:01:52.104192 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.104175 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 21 16:01:52.105907 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.105891 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-191.ec2.internal" not found Apr 21 16:01:52.110902 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.110883 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 16:01:52.111572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.111555 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 16:01:52.111757 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:52.111733 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 16:01:52.112606 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112588 2576 factory.go:153] Registering CRI-O factory Apr 21 16:01:52.112687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112613 2576 factory.go:223] Registration of the crio container factory successfully Apr 21 16:01:52.112687 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:52.112654 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found" Apr 21 16:01:52.112687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112665 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 16:01:52.112687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112677 2576 factory.go:55] Registering systemd factory Apr 21 16:01:52.112687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112685 2576 factory.go:223] Registration of the systemd container factory successfully Apr 21 16:01:52.112687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112685 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 16:01:52.113004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112698 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 16:01:52.113004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112705 2576 factory.go:103] Registering Raw factory Apr 21 16:01:52.113004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112717 2576 manager.go:1196] Started watching for new ooms in manager Apr 21 16:01:52.113004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112686 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 16:01:52.113004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.112843 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 21 16:01:52.113004 ip-10-0-138-191 kubenswrapper[2576]: I0421 
16:01:52.112868 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 21 16:01:52.113277 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.113113 2576 manager.go:319] Starting recovery of all containers Apr 21 16:01:52.114090 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.114070 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:52.117148 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:52.117018 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-191.ec2.internal\" not found" node="ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.121299 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.121282 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-191.ec2.internal" not found Apr 21 16:01:52.124015 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.124000 2576 manager.go:324] Recovery completed Apr 21 16:01:52.128159 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.128147 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 16:01:52.129970 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.129955 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientMemory" Apr 21 16:01:52.130027 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.129982 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 16:01:52.130027 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.129996 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientPID" Apr 21 16:01:52.130445 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.130434 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 16:01:52.130445 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.130444 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 16:01:52.130519 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.130461 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 21 16:01:52.133052 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.133040 2576 policy_none.go:49] "None policy: Start" Apr 21 16:01:52.133096 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.133056 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 16:01:52.133096 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.133065 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 21 16:01:52.181590 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.181572 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-191.ec2.internal" not found Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.182384 2576 manager.go:341] "Starting Device Plugin manager" Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:52.182406 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.182415 2576 server.go:85] "Starting device plugin registration server" Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.182615 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 16:01:52.184273 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:01:52.182628 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.182702 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.182763 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.182771 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:52.183215 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 16:01:52.184273 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:52.183249 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-191.ec2.internal\" not found" Apr 21 16:01:52.245322 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.245232 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 16:01:52.246916 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.246898 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 16:01:52.246990 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.246925 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 16:01:52.246990 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.246940 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 16:01:52.246990 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.246946 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 16:01:52.246990 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:52.246987 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 16:01:52.250308 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.250289 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:52.283425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.283403 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 16:01:52.284238 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.284223 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientMemory" Apr 21 16:01:52.284313 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.284247 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 16:01:52.284313 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.284257 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientPID" Apr 21 16:01:52.284313 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.284279 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.290769 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.290756 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.290815 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:52.290773 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-191.ec2.internal\": node \"ip-10-0-138-191.ec2.internal\" not found" Apr 21 16:01:52.347703 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.347683 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal"] Apr 21 16:01:52.350366 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.350350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.350366 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.350357 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.380351 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.380333 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.384407 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.384393 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.394112 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.394099 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 16:01:52.414321 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.414299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09a45ec1566c454073ee33f001f99f61-config\") pod \"kube-apiserver-proxy-ip-10-0-138-191.ec2.internal\" (UID: \"09a45ec1566c454073ee33f001f99f61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.414400 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.414325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.414400 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.414342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.419463 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.419449 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 16:01:52.514918 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.514876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.514918 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.514901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.515013 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.514917 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09a45ec1566c454073ee33f001f99f61-config\") pod \"kube-apiserver-proxy-ip-10-0-138-191.ec2.internal\" (UID: \"09a45ec1566c454073ee33f001f99f61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.515013 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.514950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.515013 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.514953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.515013 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.514980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09a45ec1566c454073ee33f001f99f61-config\") pod \"kube-apiserver-proxy-ip-10-0-138-191.ec2.internal\" (UID: \"09a45ec1566c454073ee33f001f99f61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.696717 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.696688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.722385 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.722360 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 21 16:01:52.996529 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.996510 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 16:01:52.997118 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.996629 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 16:01:52.997118 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.996632 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 16:01:52.997118 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:52.996642 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 16:01:53.082654 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.082607 2576 apiserver.go:52] "Watching apiserver" Apr 21 16:01:53.087577 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.087549 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:56:52 +0000 UTC" deadline="2027-10-30 12:43:18.734153765 +0000 UTC" Apr 21 16:01:53.087687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.087577 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13364h41m25.646580645s" Apr 21 16:01:53.089285 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.089267 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 16:01:53.089643 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.089625 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal","openshift-multus/multus-additional-cni-plugins-rxw6j","openshift-multus/network-metrics-daemon-clkll","openshift-network-operator/iptables-alerter-gdx82","kube-system/konnectivity-agent-6gg6g","kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv","openshift-dns/node-resolver-qdd9v","openshift-image-registry/node-ca-r68ws","openshift-multus/multus-hcrxk","openshift-network-diagnostics/network-check-target-pgrtm","openshift-ovn-kubernetes/ovnkube-node-7dm6j","openshift-cluster-node-tuning-operator/tuned-dgqxv"] Apr 21 16:01:53.091269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.091253 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.092418 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.092399 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:53.092490 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.092468 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:01:53.093682 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.093666 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.093916 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.093852 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 16:01:53.093916 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.093891 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 16:01:53.093916 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.093869 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 16:01:53.094119 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.093935 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 16:01:53.094119 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.094110 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 16:01:53.094198 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.094115 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-sknb7\"" Apr 21 16:01:53.095369 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.095309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:01:53.097212 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.097185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.098382 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.098362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.099534 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.099516 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.100878 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.100846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.102127 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.102110 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:53.102213 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.102168 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:01:53.102960 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.102945 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:01:53.103192 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.103180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 16:01:53.103480 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.103464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.104060 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.104042 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 16:01:53.104813 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.104792 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.105000 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.104910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 16:01:53.105088 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105024 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 16:01:53.105147 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.104913 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lfvlh\"" Apr 21 16:01:53.105348 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105332 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 16:01:53.105414 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105355 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-grrhp\"" Apr 21 16:01:53.105468 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105444 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 16:01:53.105468 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105453 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-txncd\"" Apr 21 16:01:53.105573 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105453 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8t4mh\"" Apr 21 16:01:53.105573 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105558 2576 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 16:01:53.105774 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105729 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 16:01:53.105774 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105757 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 16:01:53.105957 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105830 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 16:01:53.105957 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.105735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 16:01:53.106094 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106009 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zpz2n\"" Apr 21 16:01:53.106094 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106009 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fn8md\"" Apr 21 16:01:53.106216 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106094 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 16:01:53.106216 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106132 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 16:01:53.106216 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106146 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 16:01:53.106426 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106410 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 16:01:53.106468 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106444 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 16:01:53.106637 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106625 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 16:01:53.106700 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106684 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 16:01:53.106754 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106738 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2zw7l\"" Apr 21 16:01:53.106806 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.106763 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 16:01:53.107694 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.107673 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 16:01:53.107782 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.107770 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:01:53.108150 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.108128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x5zds\"" Apr 21 16:01:53.111439 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.111424 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 16:01:53.113462 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.113448 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 16:01:53.119400 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-host\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.119484 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdjx5\" (UniqueName: \"kubernetes.io/projected/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-kube-api-access-fdjx5\") pod \"iptables-alerter-gdx82\" (UID: \"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.119484 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-os-release\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.119484 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-modprobe-d\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.119619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysconfig\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.119619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-multus-certs\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.119619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119546 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:53.119619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-run\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.119764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-sys-fs\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.119764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-slash\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.119764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-env-overrides\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.119878 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-cni-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.119878 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-conf-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.119878 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-openvswitch\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.119878 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0983fe40-299f-4402-9953-2982f995ea8f-serviceca\") pod \"node-ca-r68ws\" (UID: 
\"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.120021 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-kubernetes\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.120021 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysctl-conf\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.120021 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-etc-selinux\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.120021 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.119996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.120199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0983fe40-299f-4402-9953-2982f995ea8f-host\") pod \"node-ca-r68ws\" (UID: \"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.120199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-etc-openvswitch\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.120199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-system-cni-dir\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.120199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-os-release\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.120199 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-sys\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.120199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/81b42d6a-976a-40b8-887e-6faf9b28d9b4-agent-certs\") pod \"konnectivity-agent-6gg6g\" (UID: \"81b42d6a-976a-40b8-887e-6faf9b28d9b4\") " pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:01:53.120387 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-run-ovn-kubernetes\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.120387 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.120387 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-socket-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.120387 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-iptables-alerter-script\") pod \"iptables-alerter-gdx82\" (UID: \"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.120387 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/81b42d6a-976a-40b8-887e-6faf9b28d9b4-konnectivity-ca\") pod \"konnectivity-agent-6gg6g\" (UID: \"81b42d6a-976a-40b8-887e-6faf9b28d9b4\") " pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:01:53.120387 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.120567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120388 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxv7\" (UniqueName: \"kubernetes.io/projected/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-kube-api-access-8fxv7\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.120567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-cni-netd\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.120567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.120567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mgv\" (UniqueName: \"kubernetes.io/projected/b782d8e0-5d5e-4897-91b4-49f7049c7fac-kube-api-access-c4mgv\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:53.120567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-systemd\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.120567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-host-slash\") pod \"iptables-alerter-gdx82\" (UID: \"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.120567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-node-log\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-cni-bin\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3debd313-0085-4668-87ee-27bfcceec0f2-hosts-file\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42d82\" (UniqueName: \"kubernetes.io/projected/0983fe40-299f-4402-9953-2982f995ea8f-kube-api-access-42d82\") pod \"node-ca-r68ws\" (UID: \"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-cni-binary-copy\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx59q\" (UniqueName: \"kubernetes.io/projected/3644fb96-b246-4565-8e9c-3e12c0ff7d41-kube-api-access-jx59q\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzt7\" (UniqueName: \"kubernetes.io/projected/c5c002ed-ea31-4b63-a340-208a5f1f28e3-kube-api-access-mdzt7\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3debd313-0085-4668-87ee-27bfcceec0f2-tmp-dir\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.120787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-k8s-cni-cncf-io\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-netns\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysctl-d\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-systemd-units\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-run-netns\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.120987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-var-lib-openvswitch\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-log-socket\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovn-node-metrics-cert\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4p9s\" (UniqueName: \"kubernetes.io/projected/3debd313-0085-4668-87ee-27bfcceec0f2-kube-api-access-p4p9s\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " 
pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-system-cni-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-registration-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-tuned\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.121155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3644fb96-b246-4565-8e9c-3e12c0ff7d41-tmp\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-kubelet\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cnibin\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-cnibin\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121231 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-socket-dir-parent\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121247 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-cni-bin\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-cni-multus\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-kubelet\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-daemon-config\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-lib-modules\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-device-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-ovn\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovnkube-script-lib\") pod 
\"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-hostroot\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkxgg\" (UniqueName: \"kubernetes.io/projected/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-kube-api-access-dkxgg\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.121612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-systemd\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.122179 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovnkube-config\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.122179 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbj2f\" (UniqueName: \"kubernetes.io/projected/eb33d6a3-42f4-4598-8722-83e1ccb59eae-kube-api-access-xbj2f\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.122179 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-etc-kubernetes\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.122179 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.121523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-var-lib-kubelet\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.126567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.126544 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 16:01:53.146554 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.146534 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-d6v5q" Apr 21 16:01:53.156449 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:01:53.156429 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-d6v5q" Apr 21 16:01:53.222058 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-cni-bin\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222141 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:53.222141 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3debd313-0085-4668-87ee-27bfcceec0f2-hosts-file\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.222141 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42d82\" (UniqueName: \"kubernetes.io/projected/0983fe40-299f-4402-9953-2982f995ea8f-kube-api-access-42d82\") pod \"node-ca-r68ws\" (UID: \"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.222266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-cni-bin\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-cni-binary-copy\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.222266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx59q\" (UniqueName: \"kubernetes.io/projected/3644fb96-b246-4565-8e9c-3e12c0ff7d41-kube-api-access-jx59q\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.222266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3debd313-0085-4668-87ee-27bfcceec0f2-hosts-file\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.222266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.222266 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.222206 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:53.222266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzt7\" (UniqueName: \"kubernetes.io/projected/c5c002ed-ea31-4b63-a340-208a5f1f28e3-kube-api-access-mdzt7\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.222266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3debd313-0085-4668-87ee-27bfcceec0f2-tmp-dir\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.222302 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:01:53.72228128 +0000 UTC m=+2.098463086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-k8s-cni-cncf-io\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-netns\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysctl-d\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-systemd-units\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-run-netns\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-netns\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-k8s-cni-cncf-io\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-var-lib-openvswitch\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-log-socket\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-run-netns\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovn-node-metrics-cert\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-var-lib-openvswitch\") pod \"ovnkube-node-7dm6j\" 
(UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysctl-d\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-systemd-units\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.222636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4p9s\" (UniqueName: \"kubernetes.io/projected/3debd313-0085-4668-87ee-27bfcceec0f2-kube-api-access-p4p9s\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-log-socket\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3debd313-0085-4668-87ee-27bfcceec0f2-tmp-dir\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-system-cni-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-system-cni-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-registration-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-tuned\") pod \"tuned-dgqxv\" (UID: 
\"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3644fb96-b246-4565-8e9c-3e12c0ff7d41-tmp\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-cni-binary-copy\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-registration-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-kubelet\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cnibin\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-kubelet\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cnibin\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-cnibin\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-socket-dir-parent\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-cnibin\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.223437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-cni-bin\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-socket-dir-parent\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-cni-multus\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-kubelet\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-cni-bin\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-daemon-config\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222912 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-cni-multus\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-lib-modules\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-var-lib-kubelet\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.222977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-device-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-ovn\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovnkube-script-lib\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-hostroot\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-lib-modules\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkxgg\" (UniqueName: 
\"kubernetes.io/projected/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-kube-api-access-dkxgg\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-ovn\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-systemd\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.224270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovnkube-config\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-systemd\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-hostroot\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223381 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-daemon-config\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbj2f\" (UniqueName: \"kubernetes.io/projected/eb33d6a3-42f4-4598-8722-83e1ccb59eae-kube-api-access-xbj2f\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-etc-kubernetes\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-var-lib-kubelet\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-host\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdjx5\" (UniqueName: \"kubernetes.io/projected/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-kube-api-access-fdjx5\") pod \"iptables-alerter-gdx82\" (UID: \"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-os-release\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-modprobe-d\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysconfig\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-multus-certs\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223725 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-run\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovnkube-config\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.225105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-sys-fs\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-slash\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-device-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-env-overrides\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-host-run-multus-certs\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-sys-fs\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-cni-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: 
I0421 16:01:53.223893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-os-release\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-conf-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-openvswitch\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-modprobe-d\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.223976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0983fe40-299f-4402-9953-2982f995ea8f-serviceca\") pod \"node-ca-r68ws\" (UID: \"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-kubernetes\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysconfig\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-run\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysctl-conf\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224071 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-etc-kubernetes\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-cni-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.225910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-etc-selinux\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-host\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0983fe40-299f-4402-9953-2982f995ea8f-host\") pod \"node-ca-r68ws\" (UID: \"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-slash\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-etc-openvswitch\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-system-cni-dir\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 
16:01:53.224236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-os-release\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-sysctl-conf\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-sys\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/81b42d6a-976a-40b8-887e-6faf9b28d9b4-agent-certs\") pod \"konnectivity-agent-6gg6g\" (UID: \"81b42d6a-976a-40b8-887e-6faf9b28d9b4\") " pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-run-ovn-kubernetes\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-run-openvswitch\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-socket-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-iptables-alerter-script\") pod \"iptables-alerter-gdx82\" (UID: 
\"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/81b42d6a-976a-40b8-887e-6faf9b28d9b4-konnectivity-ca\") pod \"konnectivity-agent-6gg6g\" (UID: \"81b42d6a-976a-40b8-887e-6faf9b28d9b4\") " pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:01:53.226749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxv7\" (UniqueName: \"kubernetes.io/projected/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-kube-api-access-8fxv7\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-cni-netd\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-kubernetes\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mgv\" (UniqueName: \"kubernetes.io/projected/b782d8e0-5d5e-4897-91b4-49f7049c7fac-kube-api-access-c4mgv\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-systemd\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224659 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-run-ovn-kubernetes\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovnkube-script-lib\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-host-slash\") pod \"iptables-alerter-gdx82\" (UID: \"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-node-log\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-etc-openvswitch\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-host-slash\") pod \"iptables-alerter-gdx82\" (UID: \"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-var-lib-kubelet\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-system-cni-dir\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-socket-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 
21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5c002ed-ea31-4b63-a340-208a5f1f28e3-os-release\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.227600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-multus-conf-dir\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb33d6a3-42f4-4598-8722-83e1ccb59eae-env-overrides\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.224987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-sys\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-host-cni-netd\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225407 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-systemd\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c5c002ed-ea31-4b63-a340-208a5f1f28e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: 
\"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0983fe40-299f-4402-9953-2982f995ea8f-host\") pod \"node-ca-r68ws\" (UID: \"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/81b42d6a-976a-40b8-887e-6faf9b28d9b4-konnectivity-ca\") pod \"konnectivity-agent-6gg6g\" (UID: \"81b42d6a-976a-40b8-887e-6faf9b28d9b4\") " pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-etc-selinux\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb33d6a3-42f4-4598-8722-83e1ccb59eae-node-log\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0983fe40-299f-4402-9953-2982f995ea8f-serviceca\") pod \"node-ca-r68ws\" (UID: \"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.225960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-iptables-alerter-script\") pod \"iptables-alerter-gdx82\" (UID: \"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.226488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3644fb96-b246-4565-8e9c-3e12c0ff7d41-tmp\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.226599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb33d6a3-42f4-4598-8722-83e1ccb59eae-ovn-node-metrics-cert\") pod 
\"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.227380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3644fb96-b246-4565-8e9c-3e12c0ff7d41-etc-tuned\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.228116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.227932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/81b42d6a-976a-40b8-887e-6faf9b28d9b4-agent-certs\") pod \"konnectivity-agent-6gg6g\" (UID: \"81b42d6a-976a-40b8-887e-6faf9b28d9b4\") " pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:01:53.232734 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.232714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx59q\" (UniqueName: \"kubernetes.io/projected/3644fb96-b246-4565-8e9c-3e12c0ff7d41-kube-api-access-jx59q\") pod \"tuned-dgqxv\" (UID: \"3644fb96-b246-4565-8e9c-3e12c0ff7d41\") " pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.236098 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.236079 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:01:53.236195 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.236103 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:01:53.236195 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.236116 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ck9m8 for pod openshift-network-diagnostics/network-check-target-pgrtm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:53.236195 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.236192 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8 podName:9a9fc212-5dd1-4633-82ec-c0c7aeee954f nodeName:}" failed. No retries permitted until 2026-04-21 16:01:53.736174926 +0000 UTC m=+2.112356717 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ck9m8" (UniqueName: "kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8") pod "network-check-target-pgrtm" (UID: "9a9fc212-5dd1-4633-82ec-c0c7aeee954f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:53.236742 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.236706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkxgg\" (UniqueName: \"kubernetes.io/projected/4ecc38cb-25f5-4bc0-ae83-f6897b537b0a-kube-api-access-dkxgg\") pod \"multus-hcrxk\" (UID: \"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a\") " pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.237183 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.237152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdjx5\" (UniqueName: \"kubernetes.io/projected/ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1-kube-api-access-fdjx5\") pod \"iptables-alerter-gdx82\" (UID: \"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1\") " pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.241688 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.237979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbj2f\" (UniqueName: \"kubernetes.io/projected/eb33d6a3-42f4-4598-8722-83e1ccb59eae-kube-api-access-xbj2f\") pod \"ovnkube-node-7dm6j\" (UID: \"eb33d6a3-42f4-4598-8722-83e1ccb59eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.241688 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.238308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42d82\" (UniqueName: \"kubernetes.io/projected/0983fe40-299f-4402-9953-2982f995ea8f-kube-api-access-42d82\") pod \"node-ca-r68ws\" (UID: \"0983fe40-299f-4402-9953-2982f995ea8f\") " pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.241688 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.239689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mgv\" (UniqueName: \"kubernetes.io/projected/b782d8e0-5d5e-4897-91b4-49f7049c7fac-kube-api-access-c4mgv\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:53.241688 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.241008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4p9s\" (UniqueName: \"kubernetes.io/projected/3debd313-0085-4668-87ee-27bfcceec0f2-kube-api-access-p4p9s\") pod \"node-resolver-qdd9v\" (UID: \"3debd313-0085-4668-87ee-27bfcceec0f2\") " pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.242661 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.242645 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxv7\" (UniqueName: \"kubernetes.io/projected/2eb59892-b9e2-465a-8cee-e0ff36f2f66f-kube-api-access-8fxv7\") pod \"aws-ebs-csi-driver-node-7pwjv\" (UID: \"2eb59892-b9e2-465a-8cee-e0ff36f2f66f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.242740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.242698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzt7\" (UniqueName: 
\"kubernetes.io/projected/c5c002ed-ea31-4b63-a340-208a5f1f28e3-kube-api-access-mdzt7\") pod \"multus-additional-cni-plugins-rxw6j\" (UID: \"c5c002ed-ea31-4b63-a340-208a5f1f28e3\") " pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.248295 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.248269 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a45ec1566c454073ee33f001f99f61.slice/crio-7cedb700333d5cfc4fdc60e51fa926f03353ada8ffda620e79b9ae2c9254a525 WatchSource:0}: Error finding container 7cedb700333d5cfc4fdc60e51fa926f03353ada8ffda620e79b9ae2c9254a525: Status 404 returned error can't find the container with id 7cedb700333d5cfc4fdc60e51fa926f03353ada8ffda620e79b9ae2c9254a525 Apr 21 16:01:53.248812 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.248796 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e9ac9a5ecee46125a37b1b7d1e8dc22.slice/crio-f373db28c066e1c1fe4e1f7aaf52acf282216f46035300b178ba64e757c96505 WatchSource:0}: Error finding container f373db28c066e1c1fe4e1f7aaf52acf282216f46035300b178ba64e757c96505: Status 404 returned error can't find the container with id f373db28c066e1c1fe4e1f7aaf52acf282216f46035300b178ba64e757c96505 Apr 21 16:01:53.253227 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.253208 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:01:53.254163 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.254146 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:01:53.258870 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.258842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" Apr 21 16:01:53.261086 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.261060 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb33d6a3_42f4_4598_8722_83e1ccb59eae.slice/crio-ea8b159610520f7860676e63fc36293025ed527a33e67981a82d5ce911cc7873 WatchSource:0}: Error finding container ea8b159610520f7860676e63fc36293025ed527a33e67981a82d5ce911cc7873: Status 404 returned error can't find the container with id ea8b159610520f7860676e63fc36293025ed527a33e67981a82d5ce911cc7873 Apr 21 16:01:53.265424 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.265395 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3644fb96_b246_4565_8e9c_3e12c0ff7d41.slice/crio-3c1f2883098402a33ff2621d1f9ecf52d58544f9faa87d2003edbf014896ea1c WatchSource:0}: Error finding container 3c1f2883098402a33ff2621d1f9ecf52d58544f9faa87d2003edbf014896ea1c: Status 404 returned error can't find the container with id 3c1f2883098402a33ff2621d1f9ecf52d58544f9faa87d2003edbf014896ea1c Apr 21 16:01:53.417751 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.417733 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" Apr 21 16:01:53.423219 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.423196 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c002ed_ea31_4b63_a340_208a5f1f28e3.slice/crio-ee12dd4ccec002301541a7241b57b92ea58e308f91c8888f2987186fcceec48a WatchSource:0}: Error finding container ee12dd4ccec002301541a7241b57b92ea58e308f91c8888f2987186fcceec48a: Status 404 returned error can't find the container with id ee12dd4ccec002301541a7241b57b92ea58e308f91c8888f2987186fcceec48a Apr 21 16:01:53.440084 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.440064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gdx82" Apr 21 16:01:53.445707 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.445684 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded5b38e2_9d42_4cbc_8f24_8c6444ebb5a1.slice/crio-457a4596cbe621db08b66b92f296f9921cde97e1bf823a0115722e503db2988e WatchSource:0}: Error finding container 457a4596cbe621db08b66b92f296f9921cde97e1bf823a0115722e503db2988e: Status 404 returned error can't find the container with id 457a4596cbe621db08b66b92f296f9921cde97e1bf823a0115722e503db2988e Apr 21 16:01:53.461085 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.461069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:01:53.465971 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.465953 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81b42d6a_976a_40b8_887e_6faf9b28d9b4.slice/crio-c5bd857c2e754d1b1c06dae0e401188ed4566b235437bbe61431d30c9c249ff7 WatchSource:0}: Error finding container c5bd857c2e754d1b1c06dae0e401188ed4566b235437bbe61431d30c9c249ff7: Status 404 returned error can't find the container with id c5bd857c2e754d1b1c06dae0e401188ed4566b235437bbe61431d30c9c249ff7 Apr 21 16:01:53.470696 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.470681 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" Apr 21 16:01:53.476223 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.476207 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eb59892_b9e2_465a_8cee_e0ff36f2f66f.slice/crio-cf49bb4128563fe547808c28c99d2f0e0f4dba90168ed3f6a299184acb0e6e32 WatchSource:0}: Error finding container cf49bb4128563fe547808c28c99d2f0e0f4dba90168ed3f6a299184acb0e6e32: Status 404 returned error can't find the container with id cf49bb4128563fe547808c28c99d2f0e0f4dba90168ed3f6a299184acb0e6e32 Apr 21 16:01:53.485184 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.485167 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qdd9v" Apr 21 16:01:53.490112 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.490091 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3debd313_0085_4668_87ee_27bfcceec0f2.slice/crio-4071ae6496b0396f138042ad8a8aadbe6b6aafe90e649257478f93eb1af61baf WatchSource:0}: Error finding container 4071ae6496b0396f138042ad8a8aadbe6b6aafe90e649257478f93eb1af61baf: Status 404 returned error can't find the container with id 4071ae6496b0396f138042ad8a8aadbe6b6aafe90e649257478f93eb1af61baf Apr 21 16:01:53.503657 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.503641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r68ws" Apr 21 16:01:53.508526 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.508506 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0983fe40_299f_4402_9953_2982f995ea8f.slice/crio-a57dfa593802e85b45b5f629bb4e6d2ec852ef669be85cf2bee91b155ddf71ec WatchSource:0}: Error finding container a57dfa593802e85b45b5f629bb4e6d2ec852ef669be85cf2bee91b155ddf71ec: Status 404 returned error can't find the container with id a57dfa593802e85b45b5f629bb4e6d2ec852ef669be85cf2bee91b155ddf71ec Apr 21 16:01:53.519363 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.519347 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hcrxk" Apr 21 16:01:53.524359 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:01:53.524336 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ecc38cb_25f5_4bc0_ae83_f6897b537b0a.slice/crio-1a211c13d38f18f3c91b891de5fdcf65d540a70e26dc767e65c6db22655a0b1b WatchSource:0}: Error finding container 1a211c13d38f18f3c91b891de5fdcf65d540a70e26dc767e65c6db22655a0b1b: Status 404 returned error can't find the container with id 1a211c13d38f18f3c91b891de5fdcf65d540a70e26dc767e65c6db22655a0b1b Apr 21 16:01:53.729761 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.729688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:53.729925 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.729833 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:53.729925 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.729912 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:01:54.729890527 +0000 UTC m=+3.106072320 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:53.812078 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.812052 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:53.830553 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.830526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:53.830678 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.830665 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:01:53.830742 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.830683 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:01:53.830742 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.830696 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ck9m8 for pod openshift-network-diagnostics/network-check-target-pgrtm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:53.830919 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:53.830750 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8 podName:9a9fc212-5dd1-4633-82ec-c0c7aeee954f nodeName:}" failed. No retries permitted until 2026-04-21 16:01:54.830732058 +0000 UTC m=+3.206913861 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ck9m8" (UniqueName: "kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8") pod "network-check-target-pgrtm" (UID: "9a9fc212-5dd1-4633-82ec-c0c7aeee954f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:53.975902 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.975875 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:53.999414 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:53.999217 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:54.158251 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.158080 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:56:53 +0000 UTC" deadline="2027-12-02 07:31:03.672192609 +0000 UTC" Apr 21 16:01:54.158251 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.158120 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14151h29m9.514076662s" Apr 21 16:01:54.249716 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.249465 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:54.249716 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:54.249610 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:01:54.280660 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.280604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hcrxk" event={"ID":"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a","Type":"ContainerStarted","Data":"1a211c13d38f18f3c91b891de5fdcf65d540a70e26dc767e65c6db22655a0b1b"} Apr 21 16:01:54.286699 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.286667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r68ws" event={"ID":"0983fe40-299f-4402-9953-2982f995ea8f","Type":"ContainerStarted","Data":"a57dfa593802e85b45b5f629bb4e6d2ec852ef669be85cf2bee91b155ddf71ec"} Apr 21 16:01:54.293058 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.293031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qdd9v" event={"ID":"3debd313-0085-4668-87ee-27bfcceec0f2","Type":"ContainerStarted","Data":"4071ae6496b0396f138042ad8a8aadbe6b6aafe90e649257478f93eb1af61baf"} Apr 21 16:01:54.309147 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.309123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" event={"ID":"2eb59892-b9e2-465a-8cee-e0ff36f2f66f","Type":"ContainerStarted","Data":"cf49bb4128563fe547808c28c99d2f0e0f4dba90168ed3f6a299184acb0e6e32"} Apr 21 16:01:54.313471 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.313447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6gg6g" event={"ID":"81b42d6a-976a-40b8-887e-6faf9b28d9b4","Type":"ContainerStarted","Data":"c5bd857c2e754d1b1c06dae0e401188ed4566b235437bbe61431d30c9c249ff7"} Apr 21 16:01:54.326670 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.326633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gdx82" event={"ID":"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1","Type":"ContainerStarted","Data":"457a4596cbe621db08b66b92f296f9921cde97e1bf823a0115722e503db2988e"} Apr 21 16:01:54.336556 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.336528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" event={"ID":"c5c002ed-ea31-4b63-a340-208a5f1f28e3","Type":"ContainerStarted","Data":"ee12dd4ccec002301541a7241b57b92ea58e308f91c8888f2987186fcceec48a"} Apr 21 16:01:54.356398 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.356366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" event={"ID":"7e9ac9a5ecee46125a37b1b7d1e8dc22","Type":"ContainerStarted","Data":"f373db28c066e1c1fe4e1f7aaf52acf282216f46035300b178ba64e757c96505"} Apr 21 16:01:54.376307 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.376234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" event={"ID":"3644fb96-b246-4565-8e9c-3e12c0ff7d41","Type":"ContainerStarted","Data":"3c1f2883098402a33ff2621d1f9ecf52d58544f9faa87d2003edbf014896ea1c"} Apr 21 16:01:54.386382 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.386263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"ea8b159610520f7860676e63fc36293025ed527a33e67981a82d5ce911cc7873"} Apr 21 16:01:54.403982 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.403956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" event={"ID":"09a45ec1566c454073ee33f001f99f61","Type":"ContainerStarted","Data":"7cedb700333d5cfc4fdc60e51fa926f03353ada8ffda620e79b9ae2c9254a525"} Apr 21 16:01:54.738484 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.738383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:54.738643 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:54.738552 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:54.738643 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:54.738633 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:01:56.738609325 +0000 UTC m=+5.114791117 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:54.839027 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:54.838993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:54.839199 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:54.839144 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:01:54.839199 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:54.839161 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:01:54.839199 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:54.839173 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ck9m8 for pod openshift-network-diagnostics/network-check-target-pgrtm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:54.839365 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:54.839228 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8 podName:9a9fc212-5dd1-4633-82ec-c0c7aeee954f nodeName:}" failed. No retries permitted until 2026-04-21 16:01:56.839209006 +0000 UTC m=+5.215390798 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ck9m8" (UniqueName: "kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8") pod "network-check-target-pgrtm" (UID: "9a9fc212-5dd1-4633-82ec-c0c7aeee954f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:55.159129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:55.159044 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:56:53 +0000 UTC" deadline="2027-10-08 21:10:25.386417812 +0000 UTC" Apr 21 16:01:55.159129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:55.159076 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12845h8m30.227344912s" Apr 21 16:01:55.248255 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:55.248228 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:55.248403 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:55.248350 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:01:56.248702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:56.248469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:56.249146 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:56.248831 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:01:56.753902 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:56.753340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:56.753902 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:56.753472 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:56.753902 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:56.753554 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:02:00.753533803 +0000 UTC m=+9.129715597 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:56.854292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:56.854094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:56.854292 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:56.854273 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:01:56.854292 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:56.854291 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:01:56.854532 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:56.854303 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ck9m8 for pod openshift-network-diagnostics/network-check-target-pgrtm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:56.854532 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:56.854364 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8 podName:9a9fc212-5dd1-4633-82ec-c0c7aeee954f nodeName:}" failed. No retries permitted until 2026-04-21 16:02:00.854344878 +0000 UTC m=+9.230526670 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ck9m8" (UniqueName: "kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8") pod "network-check-target-pgrtm" (UID: "9a9fc212-5dd1-4633-82ec-c0c7aeee954f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:57.247381 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:57.247305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:57.247547 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:57.247429 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:01:58.247705 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:58.247650 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:01:58.248237 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:58.247789 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:01:59.247770 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:01:59.247738 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:01:59.248235 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:01:59.247898 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:00.248317 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:00.247788 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:00.248317 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:00.247951 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:00.785660 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:00.785582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:00.785843 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:00.785711 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:00.785843 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:00.785775 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:02:08.785755799 +0000 UTC m=+17.161937589 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:00.886331 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:00.886260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:00.886494 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:00.886406 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:02:00.886494 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:00.886425 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:02:00.886494 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:00.886436 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ck9m8 for pod openshift-network-diagnostics/network-check-target-pgrtm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:00.886494 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:00.886492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8 podName:9a9fc212-5dd1-4633-82ec-c0c7aeee954f nodeName:}" failed. No retries permitted until 2026-04-21 16:02:08.886475401 +0000 UTC m=+17.262657205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ck9m8" (UniqueName: "kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8") pod "network-check-target-pgrtm" (UID: "9a9fc212-5dd1-4633-82ec-c0c7aeee954f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:01.248084 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:01.248005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:01.248241 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:01.248142 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:02.248197 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:02.248158 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:02.248627 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:02.248281 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:03.247942 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:03.247872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:03.248080 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:03.248010 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:04.247551 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:04.247513 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:04.247999 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:04.247629 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:05.247557 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:05.247525 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:05.247970 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:05.247634 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:06.247144 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:06.247109 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:06.247305 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:06.247251 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:07.247704 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:07.247673 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:07.248177 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:07.247790 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:08.247530 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:08.247495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:08.247706 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:08.247668 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:08.847668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:08.847635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:08.848105 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:08.847749 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:08.848105 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:08.847817 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:02:24.847796198 +0000 UTC m=+33.223977997 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:08.948598 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:08.948571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:08.948732 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:08.948707 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:02:08.948732 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:08.948728 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:02:08.948829 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:08.948740 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ck9m8 for pod openshift-network-diagnostics/network-check-target-pgrtm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:08.948829 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:08.948797 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8 podName:9a9fc212-5dd1-4633-82ec-c0c7aeee954f nodeName:}" failed. No retries permitted until 2026-04-21 16:02:24.948781939 +0000 UTC m=+33.324963744 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ck9m8" (UniqueName: "kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8") pod "network-check-target-pgrtm" (UID: "9a9fc212-5dd1-4633-82ec-c0c7aeee954f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:09.247358 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:09.247290 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:09.247514 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:09.247399 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:10.248190 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:10.248160 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:10.248617 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:10.248269 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:11.247925 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:11.247902 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:11.248025 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:11.248005 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:11.438289 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:11.438098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"dc07fe72e3a92ac6ca382bd298d3792deffc44980980de697c517bf2c22e0ee1"} Apr 21 16:02:11.441990 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:11.440966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" event={"ID":"09a45ec1566c454073ee33f001f99f61","Type":"ContainerStarted","Data":"5e90d441b0148ef7c6772e8c2204bed05fffd20e98b1fc984ed2e4ad112c548a"} Apr 21 16:02:11.449300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:11.449276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" event={"ID":"3644fb96-b246-4565-8e9c-3e12c0ff7d41","Type":"ContainerStarted","Data":"44e39d08bd3021d4f051f64bf7ee224e2f5ebfc844c3f35b2546bcd2afd4618f"} Apr 21 16:02:11.457721 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:11.457640 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" podStartSLOduration=19.457628829 podStartE2EDuration="19.457628829s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:11.457407594 +0000 UTC m=+19.833589405" watchObservedRunningTime="2026-04-21 16:02:11.457628829 +0000 UTC m=+19.833810640" Apr 21 16:02:11.476694 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:11.476654 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dgqxv" podStartSLOduration=1.6440043439999998 podStartE2EDuration="19.47663968s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.267325494 +0000 UTC m=+1.643507282" lastFinishedPulling="2026-04-21 16:02:11.099960818 +0000 UTC m=+19.476142618" observedRunningTime="2026-04-21 16:02:11.476036318 +0000 UTC m=+19.852218132" watchObservedRunningTime="2026-04-21 16:02:11.47663968 +0000 UTC 
m=+19.852821491" Apr 21 16:02:12.248303 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.248124 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:12.248414 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:12.248330 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:12.454643 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.454612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hcrxk" event={"ID":"4ecc38cb-25f5-4bc0-ae83-f6897b537b0a","Type":"ContainerStarted","Data":"e04e7b8a254af9e732feebc7c3f4618ba16f0b6fc6d2af4be4acc33c4012dea4"} Apr 21 16:02:12.455911 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.455890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r68ws" event={"ID":"0983fe40-299f-4402-9953-2982f995ea8f","Type":"ContainerStarted","Data":"2fe5db5153ea55be0118499f2e69280fd6e596a20d10c85f7bd332cb2eb731e6"} Apr 21 16:02:12.457111 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.457093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qdd9v" event={"ID":"3debd313-0085-4668-87ee-27bfcceec0f2","Type":"ContainerStarted","Data":"0b7856c3f3b6bb25096ff1deead2a7aaf7602ee7b1bf1b8a94a2b2421721cfac"} Apr 21 16:02:12.458321 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.458299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" event={"ID":"2eb59892-b9e2-465a-8cee-e0ff36f2f66f","Type":"ContainerStarted","Data":"f5b94a20506af132f6db6311f59fbbf198667a161878b310f07f5ebf9c827021"} Apr 21 16:02:12.459454 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.459433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6gg6g" event={"ID":"81b42d6a-976a-40b8-887e-6faf9b28d9b4","Type":"ContainerStarted","Data":"41772e17a7073acad359f8d205856eabcf3cf0bc332f6c6ac91706b13849a748"} Apr 21 16:02:12.460662 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.460638 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c002ed-ea31-4b63-a340-208a5f1f28e3" containerID="2a86b3b97b0a1001cba418d59395316ac419b55e626f960e57e566b0bb5cf941" exitCode=0 Apr 21 16:02:12.460745 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.460691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" event={"ID":"c5c002ed-ea31-4b63-a340-208a5f1f28e3","Type":"ContainerDied","Data":"2a86b3b97b0a1001cba418d59395316ac419b55e626f960e57e566b0bb5cf941"} Apr 21 16:02:12.462033 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.461970 2576 generic.go:358] "Generic (PLEG): container finished" podID="7e9ac9a5ecee46125a37b1b7d1e8dc22" containerID="14cc2ab3632c97a86553593bdb2969d6ba591ab7a00c076184a72c7aeca60a89" exitCode=0 Apr 21 16:02:12.462089 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.462037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" 
event={"ID":"7e9ac9a5ecee46125a37b1b7d1e8dc22","Type":"ContainerDied","Data":"14cc2ab3632c97a86553593bdb2969d6ba591ab7a00c076184a72c7aeca60a89"} Apr 21 16:02:12.464707 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.464690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:02:12.465041 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.465023 2576 generic.go:358] "Generic (PLEG): container finished" podID="eb33d6a3-42f4-4598-8722-83e1ccb59eae" containerID="798d8716b689488a2c0637d9cb075ccd29370af7c449fbceaa35df9b8cbd45f0" exitCode=1 Apr 21 16:02:12.465116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.465087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"5789ff53b4a7de93b6b994b8ec63b0fd1658ba113f897cb80000ca9753898779"} Apr 21 16:02:12.465116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.465110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"cc7cbb1f6d5d64ddb13222cf813c01cadd68d881fda58fa0dc9afd4960508a69"} Apr 21 16:02:12.465238 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.465123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"62c7cf881fb506c887f4c3c65ccb16d98ca9922f6d0d72f46b26866986b1a824"} Apr 21 16:02:12.465238 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.465131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"a377b4e23c38be92d09f122e936d7927c314bb20547fb513973b172d51628132"} Apr 21 16:02:12.465238 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.465140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerDied","Data":"798d8716b689488a2c0637d9cb075ccd29370af7c449fbceaa35df9b8cbd45f0"} Apr 21 16:02:12.472923 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.472882 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hcrxk" podStartSLOduration=2.5887636990000003 podStartE2EDuration="20.472851828s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.525630355 +0000 UTC m=+1.901812144" lastFinishedPulling="2026-04-21 16:02:11.409718462 +0000 UTC m=+19.785900273" observedRunningTime="2026-04-21 16:02:12.472849569 +0000 UTC m=+20.849031381" watchObservedRunningTime="2026-04-21 16:02:12.472851828 +0000 UTC m=+20.849033639" Apr 21 16:02:12.530511 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.530424 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qdd9v" podStartSLOduration=2.948523657 podStartE2EDuration="20.530408346s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.491511101 +0000 UTC m=+1.867692890" lastFinishedPulling="2026-04-21 16:02:11.073395776 +0000 UTC m=+19.449577579" observedRunningTime="2026-04-21 16:02:12.530226021 +0000 UTC m=+20.906407834" 
watchObservedRunningTime="2026-04-21 16:02:12.530408346 +0000 UTC m=+20.906590158" Apr 21 16:02:12.566136 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:12.566087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6gg6g" podStartSLOduration=2.934004168 podStartE2EDuration="20.566075689s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.467311802 +0000 UTC m=+1.843493594" lastFinishedPulling="2026-04-21 16:02:11.099383311 +0000 UTC m=+19.475565115" observedRunningTime="2026-04-21 16:02:12.549502149 +0000 UTC m=+20.925683959" watchObservedRunningTime="2026-04-21 16:02:12.566075689 +0000 UTC m=+20.942257481" Apr 21 16:02:13.027246 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.027190 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 16:02:13.195875 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.195704 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T16:02:13.027211275Z","UUID":"cb94e5b9-400b-4be7-8f42-735efa2488c7","Handler":null,"Name":"","Endpoint":""} Apr 21 16:02:13.198639 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.198618 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 16:02:13.198639 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.198644 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 16:02:13.247848 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.247821 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:13.248000 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:13.247967 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:13.469675 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.469590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" event={"ID":"2eb59892-b9e2-465a-8cee-e0ff36f2f66f","Type":"ContainerStarted","Data":"9dc2cc4d1ffbe2bac735d93911bb30f392e1ff5514b1f6255ba735114fcc0239"} Apr 21 16:02:13.470964 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.470935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gdx82" event={"ID":"ed5b38e2-9d42-4cbc-8f24-8c6444ebb5a1","Type":"ContainerStarted","Data":"ef4c3ff43e771a61ecddb52345fc4b9c2d1e20f57c96097ebbd80d16bbd73e0f"} Apr 21 16:02:13.472871 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.472837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" event={"ID":"7e9ac9a5ecee46125a37b1b7d1e8dc22","Type":"ContainerStarted","Data":"427293f6ccc3e1671a433f5e3ba2de87704122ad140caa857996da2aaa476f97"} Apr 21 16:02:13.490662 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.490607 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gdx82" podStartSLOduration=3.839356413 podStartE2EDuration="21.490590563s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.447114046 +0000 UTC m=+1.823295835" lastFinishedPulling="2026-04-21 16:02:11.098348181 +0000 UTC m=+19.474529985" observedRunningTime="2026-04-21 16:02:13.489908978 +0000 UTC m=+21.866090791" watchObservedRunningTime="2026-04-21 16:02:13.490590563 +0000 UTC m=+21.866772375" Apr 21 16:02:13.490962 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.490927 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-r68ws" podStartSLOduration=3.90108261 podStartE2EDuration="21.490920104s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.509789456 +0000 UTC m=+1.885971245" lastFinishedPulling="2026-04-21 16:02:11.099626941 +0000 UTC m=+19.475808739" observedRunningTime="2026-04-21 16:02:12.566328243 +0000 UTC m=+20.942510055" watchObservedRunningTime="2026-04-21 16:02:13.490920104 +0000 UTC m=+21.867101916" Apr 21 16:02:13.514629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:13.514582 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" podStartSLOduration=21.51457028 podStartE2EDuration="21.51457028s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:13.513796072 +0000 UTC m=+21.889977882" watchObservedRunningTime="2026-04-21 16:02:13.51457028 +0000 UTC m=+21.890752091" Apr 21 16:02:14.247940 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:14.247691 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:14.248129 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:14.248020 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:14.476459 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:14.476419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" event={"ID":"2eb59892-b9e2-465a-8cee-e0ff36f2f66f","Type":"ContainerStarted","Data":"b7b0ea36d98e09f792e30cf5ebd1116d044c3acdc6bbec454192bd78709b18b5"} Apr 21 16:02:14.479616 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:14.479594 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:02:14.479938 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:14.479911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"e9881ccb24c5510af8f110086bfeadc60745a1f993736db6982afbfd418051da"} Apr 21 16:02:15.247423 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:15.247390 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:15.247604 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:15.247493 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:15.472199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:15.472166 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:02:15.472842 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:15.472819 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:02:15.481477 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:15.481411 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:02:15.481912 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:15.481799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6gg6g" Apr 21 16:02:15.492246 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:15.492207 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7pwjv" podStartSLOduration=3.108105735 podStartE2EDuration="23.492193887s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.477429289 +0000 UTC m=+1.853611078" lastFinishedPulling="2026-04-21 16:02:13.861517426 +0000 UTC m=+22.237699230" observedRunningTime="2026-04-21 16:02:14.523385164 +0000 UTC m=+22.899566974" watchObservedRunningTime="2026-04-21 16:02:15.492193887 +0000 UTC m=+23.868375724" Apr 21 16:02:16.247595 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:16.247508 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:16.247746 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:16.247653 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:17.247744 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.247582 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:17.248258 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:17.247815 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:17.485906 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.485832 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c002ed-ea31-4b63-a340-208a5f1f28e3" containerID="8ee85ae13094b8339a61bf9813d898676e09fc67b499b62d8391df01e1706e4d" exitCode=0 Apr 21 16:02:17.485997 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.485920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" event={"ID":"c5c002ed-ea31-4b63-a340-208a5f1f28e3","Type":"ContainerDied","Data":"8ee85ae13094b8339a61bf9813d898676e09fc67b499b62d8391df01e1706e4d"} Apr 21 16:02:17.489205 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.489187 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:02:17.489552 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.489530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"79fed36c346357e2dce9cb78ef1f5036fd0061f6a5273a910d6873bbe27c621c"} Apr 21 16:02:17.489963 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.489941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:02:17.489963 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.489964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:02:17.490119 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.490104 2576 scope.go:117] "RemoveContainer" containerID="798d8716b689488a2c0637d9cb075ccd29370af7c449fbceaa35df9b8cbd45f0" Apr 21 16:02:17.504440 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:17.504423 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:02:18.247308 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.247285 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:18.247402 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:18.247383 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:18.493053 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.492971 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c002ed-ea31-4b63-a340-208a5f1f28e3" containerID="696f544bf971e0ad25770063aeb15215de0226c4c7e4e3b183b0f42beeb728bc" exitCode=0 Apr 21 16:02:18.493431 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.493061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" event={"ID":"c5c002ed-ea31-4b63-a340-208a5f1f28e3","Type":"ContainerDied","Data":"696f544bf971e0ad25770063aeb15215de0226c4c7e4e3b183b0f42beeb728bc"} Apr 21 16:02:18.496520 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.496505 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:02:18.496963 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.496921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" event={"ID":"eb33d6a3-42f4-4598-8722-83e1ccb59eae","Type":"ContainerStarted","Data":"2f005ece66a7ba9e8a3a3b92cfe24c372c4a03384db31785dd43fcbe289911a8"} Apr 21 16:02:18.497331 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.497289 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:02:18.510968 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.510950 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:02:18.552550 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.552348 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" podStartSLOduration=8.677933205 podStartE2EDuration="26.552333457s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.262774275 +0000 UTC m=+1.638956064" lastFinishedPulling="2026-04-21 16:02:11.137174525 +0000 UTC m=+19.513356316" observedRunningTime="2026-04-21 16:02:18.552044392 +0000 UTC m=+26.928226204" watchObservedRunningTime="2026-04-21 16:02:18.552333457 +0000 UTC m=+26.928515268" Apr 21 16:02:18.819207 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.819134 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pgrtm"] Apr 21 16:02:18.819337 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.819238 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:18.819337 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:18.819307 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:18.821702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.821680 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-clkll"] Apr 21 16:02:18.821808 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:18.821783 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:18.821899 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:18.821883 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:19.500725 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:19.500662 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c002ed-ea31-4b63-a340-208a5f1f28e3" containerID="30901065c73a7ef66444db687de9d08e037615e2af0cadd398658e80cf7f13e4" exitCode=0 Apr 21 16:02:19.501187 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:19.500736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" event={"ID":"c5c002ed-ea31-4b63-a340-208a5f1f28e3","Type":"ContainerDied","Data":"30901065c73a7ef66444db687de9d08e037615e2af0cadd398658e80cf7f13e4"} Apr 21 16:02:20.248002 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:20.247934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:20.248002 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:20.247954 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:20.248222 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:20.248061 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:20.248222 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:20.248128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:22.248346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:22.248318 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:22.248896 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:22.248411 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:22.248896 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:22.248437 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:22.248896 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:22.248471 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:24.247404 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.247177 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:24.247771 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.247233 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:24.247771 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.247526 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:02:24.247771 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.247575 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pgrtm" podUID="9a9fc212-5dd1-4633-82ec-c0c7aeee954f" Apr 21 16:02:24.433740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.433710 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeReady" Apr 21 16:02:24.433938 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.433826 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 16:02:24.518355 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.518325 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ffpt6"] Apr 21 16:02:24.572152 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.572127 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rsggj"] Apr 21 16:02:24.572288 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.572222 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.575325 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.575303 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2l66p\"" Apr 21 16:02:24.575431 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.575412 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 16:02:24.575923 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.575904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 16:02:24.604168 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.604146 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rsggj"] Apr 21 16:02:24.604263 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.604175 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ffpt6"] Apr 21 16:02:24.604317 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.604281 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:24.606832 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.606814 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 16:02:24.606936 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.606848 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 16:02:24.606996 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.606964 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76wnm\"" Apr 21 16:02:24.607284 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.607268 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 16:02:24.676710 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.676682 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b713efcd-8ac3-49bc-a307-d7919be3e7ba-config-volume\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.676828 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.676720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.676828 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.676818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b713efcd-8ac3-49bc-a307-d7919be3e7ba-tmp-dir\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.676949 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.676845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkjlv\" (UniqueName: 
\"kubernetes.io/projected/b713efcd-8ac3-49bc-a307-d7919be3e7ba-kube-api-access-rkjlv\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.778123 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.778035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b713efcd-8ac3-49bc-a307-d7919be3e7ba-tmp-dir\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.778123 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.778074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkjlv\" (UniqueName: \"kubernetes.io/projected/b713efcd-8ac3-49bc-a307-d7919be3e7ba-kube-api-access-rkjlv\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.778279 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.778123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:24.778279 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.778147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmhc6\" (UniqueName: \"kubernetes.io/projected/e01c7fac-d92f-4341-ad18-e2c502d62462-kube-api-access-pmhc6\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:24.778279 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.778183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b713efcd-8ac3-49bc-a307-d7919be3e7ba-config-volume\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.778279 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.778208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.778429 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.778301 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:24.778429 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.778377 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls podName:b713efcd-8ac3-49bc-a307-d7919be3e7ba nodeName:}" failed. No retries permitted until 2026-04-21 16:02:25.2783549 +0000 UTC m=+33.654536700 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls") pod "dns-default-ffpt6" (UID: "b713efcd-8ac3-49bc-a307-d7919be3e7ba") : secret "dns-default-metrics-tls" not found Apr 21 16:02:24.778773 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.778747 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b713efcd-8ac3-49bc-a307-d7919be3e7ba-config-volume\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.790648 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.790628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkjlv\" (UniqueName: \"kubernetes.io/projected/b713efcd-8ac3-49bc-a307-d7919be3e7ba-kube-api-access-rkjlv\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.795537 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.795516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b713efcd-8ac3-49bc-a307-d7919be3e7ba-tmp-dir\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:24.879121 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.879096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:24.879234 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.879160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:24.879234 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.879185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmhc6\" (UniqueName: \"kubernetes.io/projected/e01c7fac-d92f-4341-ad18-e2c502d62462-kube-api-access-pmhc6\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:24.879336 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.879243 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:24.879336 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.879315 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:02:56.879296615 +0000 UTC m=+65.255478416 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:24.879336 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.879313 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:24.879494 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.879369 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert podName:e01c7fac-d92f-4341-ad18-e2c502d62462 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:25.37935525 +0000 UTC m=+33.755537062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert") pod "ingress-canary-rsggj" (UID: "e01c7fac-d92f-4341-ad18-e2c502d62462") : secret "canary-serving-cert" not found Apr 21 16:02:24.890408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.890384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmhc6\" (UniqueName: \"kubernetes.io/projected/e01c7fac-d92f-4341-ad18-e2c502d62462-kube-api-access-pmhc6\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:24.980327 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:24.980306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:24.980422 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.980411 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:02:24.980468 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.980425 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:02:24.980468 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.980434 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ck9m8 for pod openshift-network-diagnostics/network-check-target-pgrtm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:24.980533 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:24.980474 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8 podName:9a9fc212-5dd1-4633-82ec-c0c7aeee954f nodeName:}" failed. No retries permitted until 2026-04-21 16:02:56.98046138 +0000 UTC m=+65.356643168 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ck9m8" (UniqueName: "kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8") pod "network-check-target-pgrtm" (UID: "9a9fc212-5dd1-4633-82ec-c0c7aeee954f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:25.282525 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:25.282495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:25.282851 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:25.282610 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:25.282851 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:25.282658 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls podName:b713efcd-8ac3-49bc-a307-d7919be3e7ba nodeName:}" failed. No retries permitted until 2026-04-21 16:02:26.282645939 +0000 UTC m=+34.658827727 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls") pod "dns-default-ffpt6" (UID: "b713efcd-8ac3-49bc-a307-d7919be3e7ba") : secret "dns-default-metrics-tls" not found Apr 21 16:02:25.383481 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:25.383449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:25.383606 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:25.383584 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:25.383659 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:25.383643 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert podName:e01c7fac-d92f-4341-ad18-e2c502d62462 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:26.383629515 +0000 UTC m=+34.759811303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert") pod "ingress-canary-rsggj" (UID: "e01c7fac-d92f-4341-ad18-e2c502d62462") : secret "canary-serving-cert" not found Apr 21 16:02:26.247839 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.247800 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:26.248030 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.247803 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:26.252346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.252322 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 16:02:26.252688 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.252666 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cz78x\"" Apr 21 16:02:26.252801 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.252697 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 16:02:26.253438 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.253417 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 16:02:26.253438 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.253431 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jk8gf\"" Apr 21 16:02:26.288834 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.288811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:26.289121 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:26.288952 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:26.289121 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:26.289023 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls podName:b713efcd-8ac3-49bc-a307-d7919be3e7ba nodeName:}" failed. No retries permitted until 2026-04-21 16:02:28.28900635 +0000 UTC m=+36.665188138 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls") pod "dns-default-ffpt6" (UID: "b713efcd-8ac3-49bc-a307-d7919be3e7ba") : secret "dns-default-metrics-tls" not found Apr 21 16:02:26.390086 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.390058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:26.393342 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:26.390587 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:26.393342 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:26.390670 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert podName:e01c7fac-d92f-4341-ad18-e2c502d62462 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:28.390649194 +0000 UTC m=+36.766831005 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert") pod "ingress-canary-rsggj" (UID: "e01c7fac-d92f-4341-ad18-e2c502d62462") : secret "canary-serving-cert" not found Apr 21 16:02:26.515483 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.515425 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c002ed-ea31-4b63-a340-208a5f1f28e3" containerID="b652a515eb8f7747dd6d9dce44e4eaa81d78faf1104ae09e161900b019c5cac3" exitCode=0 Apr 21 16:02:26.515483 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:26.515457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" event={"ID":"c5c002ed-ea31-4b63-a340-208a5f1f28e3","Type":"ContainerDied","Data":"b652a515eb8f7747dd6d9dce44e4eaa81d78faf1104ae09e161900b019c5cac3"} Apr 21 16:02:27.519959 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:27.519924 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c002ed-ea31-4b63-a340-208a5f1f28e3" containerID="5304b58fc011f0a6f1f9474471b5d934d550c4dae10a6875a485f4695ba32e77" exitCode=0 Apr 21 16:02:27.520415 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:27.519976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" event={"ID":"c5c002ed-ea31-4b63-a340-208a5f1f28e3","Type":"ContainerDied","Data":"5304b58fc011f0a6f1f9474471b5d934d550c4dae10a6875a485f4695ba32e77"} Apr 21 16:02:28.303183 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:28.303151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:28.303315 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:28.303283 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:28.303359 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:28.303341 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls podName:b713efcd-8ac3-49bc-a307-d7919be3e7ba nodeName:}" failed. No retries permitted until 2026-04-21 16:02:32.303324437 +0000 UTC m=+40.679506226 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls") pod "dns-default-ffpt6" (UID: "b713efcd-8ac3-49bc-a307-d7919be3e7ba") : secret "dns-default-metrics-tls" not found Apr 21 16:02:28.404197 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:28.404172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:28.404292 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:28.404281 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:28.404336 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:28.404322 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert podName:e01c7fac-d92f-4341-ad18-e2c502d62462 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:32.404310759 +0000 UTC m=+40.780492547 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert") pod "ingress-canary-rsggj" (UID: "e01c7fac-d92f-4341-ad18-e2c502d62462") : secret "canary-serving-cert" not found Apr 21 16:02:28.524715 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:28.524692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" event={"ID":"c5c002ed-ea31-4b63-a340-208a5f1f28e3","Type":"ContainerStarted","Data":"3374aa8732be7d2ff28d580383420039780c5f7d56186f057ea9549cfc800c78"} Apr 21 16:02:28.550845 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:28.550808 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rxw6j" podStartSLOduration=4.416194364 podStartE2EDuration="36.550796272s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:01:53.424605216 +0000 UTC m=+1.800787009" lastFinishedPulling="2026-04-21 16:02:25.559207129 +0000 UTC m=+33.935388917" observedRunningTime="2026-04-21 16:02:28.54944723 +0000 UTC m=+36.925629039" watchObservedRunningTime="2026-04-21 16:02:28.550796272 +0000 UTC m=+36.926978081" Apr 21 16:02:32.330051 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:32.329899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:32.330051 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:32.330031 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:32.330432 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:32.330087 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls podName:b713efcd-8ac3-49bc-a307-d7919be3e7ba nodeName:}" failed. No retries permitted until 2026-04-21 16:02:40.330070632 +0000 UTC m=+48.706252440 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls") pod "dns-default-ffpt6" (UID: "b713efcd-8ac3-49bc-a307-d7919be3e7ba") : secret "dns-default-metrics-tls" not found Apr 21 16:02:32.430803 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:32.430780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:32.430948 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:32.430932 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:32.431016 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:32.430993 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert podName:e01c7fac-d92f-4341-ad18-e2c502d62462 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:40.430975016 +0000 UTC m=+48.807156823 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert") pod "ingress-canary-rsggj" (UID: "e01c7fac-d92f-4341-ad18-e2c502d62462") : secret "canary-serving-cert" not found Apr 21 16:02:40.383292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:40.383261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:40.383654 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:40.383368 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:40.383654 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:40.383416 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls podName:b713efcd-8ac3-49bc-a307-d7919be3e7ba nodeName:}" failed. No retries permitted until 2026-04-21 16:02:56.383403709 +0000 UTC m=+64.759585497 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls") pod "dns-default-ffpt6" (UID: "b713efcd-8ac3-49bc-a307-d7919be3e7ba") : secret "dns-default-metrics-tls" not found Apr 21 16:02:40.484073 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:40.484052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:40.484181 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:40.484156 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:40.484217 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:40.484194 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert podName:e01c7fac-d92f-4341-ad18-e2c502d62462 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:56.48418427 +0000 UTC m=+64.860366058 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert") pod "ingress-canary-rsggj" (UID: "e01c7fac-d92f-4341-ad18-e2c502d62462") : secret "canary-serving-cert" not found Apr 21 16:02:50.512623 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:50.512598 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7dm6j" Apr 21 16:02:56.390513 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:56.390475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:02:56.391026 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:56.390657 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:56.391026 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:56.390757 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls podName:b713efcd-8ac3-49bc-a307-d7919be3e7ba nodeName:}" failed. No retries permitted until 2026-04-21 16:03:28.390737992 +0000 UTC m=+96.766919780 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls") pod "dns-default-ffpt6" (UID: "b713efcd-8ac3-49bc-a307-d7919be3e7ba") : secret "dns-default-metrics-tls" not found Apr 21 16:02:56.491008 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:56.490980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:02:56.491133 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:56.491085 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:56.491173 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:56.491146 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert podName:e01c7fac-d92f-4341-ad18-e2c502d62462 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:28.491131109 +0000 UTC m=+96.867312916 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert") pod "ingress-canary-rsggj" (UID: "e01c7fac-d92f-4341-ad18-e2c502d62462") : secret "canary-serving-cert" not found Apr 21 16:02:56.892829 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:56.892799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:02:56.895408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:56.895389 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 16:02:56.903771 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:56.903751 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 16:02:56.903852 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:02:56.903802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:04:00.903789508 +0000 UTC m=+129.279971296 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : secret "metrics-daemon-secret" not found Apr 21 16:02:56.993472 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:56.993448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:56.996240 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:56.996221 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 16:02:57.006988 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:57.006966 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 16:02:57.018942 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:57.018914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck9m8\" (UniqueName: \"kubernetes.io/projected/9a9fc212-5dd1-4633-82ec-c0c7aeee954f-kube-api-access-ck9m8\") pod \"network-check-target-pgrtm\" (UID: \"9a9fc212-5dd1-4633-82ec-c0c7aeee954f\") " pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:57.166775 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:57.166713 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jk8gf\"" Apr 21 16:02:57.174253 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:57.174228 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:02:57.377548 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:57.377513 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pgrtm"] Apr 21 16:02:57.382392 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:02:57.382359 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a9fc212_5dd1_4633_82ec_c0c7aeee954f.slice/crio-5ab43cca7c8bb83fd439a81b9b2922ec92f4d7a32de40e35eee6984ed60a5479 WatchSource:0}: Error finding container 5ab43cca7c8bb83fd439a81b9b2922ec92f4d7a32de40e35eee6984ed60a5479: Status 404 returned error can't find the container with id 5ab43cca7c8bb83fd439a81b9b2922ec92f4d7a32de40e35eee6984ed60a5479 Apr 21 16:02:57.573959 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:02:57.573901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pgrtm" event={"ID":"9a9fc212-5dd1-4633-82ec-c0c7aeee954f","Type":"ContainerStarted","Data":"5ab43cca7c8bb83fd439a81b9b2922ec92f4d7a32de40e35eee6984ed60a5479"} Apr 21 16:03:00.581021 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:00.580933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pgrtm" event={"ID":"9a9fc212-5dd1-4633-82ec-c0c7aeee954f","Type":"ContainerStarted","Data":"12b2a349bcf8ef74b412a417e503f0693f788d522d9bdb194287c15b4492b02c"} Apr 21 16:03:00.581427 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:00.581062 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:03:28.396357 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:28.396317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:03:28.396701 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:28.396462 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:03:28.396701 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:28.396530 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls podName:b713efcd-8ac3-49bc-a307-d7919be3e7ba nodeName:}" failed. No retries permitted until 2026-04-21 16:04:32.396511856 +0000 UTC m=+160.772693646 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls") pod "dns-default-ffpt6" (UID: "b713efcd-8ac3-49bc-a307-d7919be3e7ba") : secret "dns-default-metrics-tls" not found Apr 21 16:03:28.497310 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:28.497281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:03:28.497425 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:28.497411 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:03:28.497482 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:28.497473 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert podName:e01c7fac-d92f-4341-ad18-e2c502d62462 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:32.497457705 +0000 UTC m=+160.873639494 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert") pod "ingress-canary-rsggj" (UID: "e01c7fac-d92f-4341-ad18-e2c502d62462") : secret "canary-serving-cert" not found Apr 21 16:03:31.586038 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:31.585996 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pgrtm" Apr 21 16:03:31.611168 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:31.611123 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pgrtm" podStartSLOduration=96.834913257 podStartE2EDuration="1m39.611110754s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:02:57.384317517 +0000 UTC m=+65.760499305" lastFinishedPulling="2026-04-21 16:03:00.160515001 +0000 UTC m=+68.536696802" observedRunningTime="2026-04-21 16:03:00.616344799 +0000 UTC m=+68.992526608" watchObservedRunningTime="2026-04-21 16:03:31.611110754 +0000 UTC m=+99.987292644" Apr 21 16:03:51.908262 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.908233 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24"] Apr 21 16:03:51.910190 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.910174 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:51.913049 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.913026 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 16:03:51.915469 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.915449 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 16:03:51.915577 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.915452 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-f65lv\"" Apr 21 16:03:51.915577 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.915482 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 16:03:51.919898 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.919882 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 16:03:51.933017 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.932995 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24"] Apr 21 16:03:51.957439 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.957417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d873a023-c5df-43cd-9008-fb0ac22d99a0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:51.957530 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.957446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjlfj\" (UniqueName: \"kubernetes.io/projected/d873a023-c5df-43cd-9008-fb0ac22d99a0-kube-api-access-tjlfj\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:51.957573 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:51.957535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:52.041205 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.041179 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5d9c9b646-wbgwm"] Apr 21 16:03:52.043214 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.043195 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.051900 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.051883 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 16:03:52.052584 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.052565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 16:03:52.052646 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.052569 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 16:03:52.053088 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.053071 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 16:03:52.053184 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.053167 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 16:03:52.053244 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.053183 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-fwtvp\"" Apr 21 16:03:52.053742 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.053727 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 16:03:52.057832 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.057814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:52.057902 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.057851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d873a023-c5df-43cd-9008-fb0ac22d99a0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:52.057902 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.057897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjlfj\" (UniqueName: \"kubernetes.io/projected/d873a023-c5df-43cd-9008-fb0ac22d99a0-kube-api-access-tjlfj\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:52.057989 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.057977 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:52.058044 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.058036 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls podName:d873a023-c5df-43cd-9008-fb0ac22d99a0 nodeName:}" failed. 
No retries permitted until 2026-04-21 16:03:52.558021123 +0000 UTC m=+120.934202912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wtt24" (UID: "d873a023-c5df-43cd-9008-fb0ac22d99a0") : secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:52.058551 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.058533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d873a023-c5df-43cd-9008-fb0ac22d99a0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:52.108376 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.108338 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5d9c9b646-wbgwm"] Apr 21 16:03:52.110049 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.110028 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 16:03:52.123600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.123574 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 16:03:52.130082 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.130055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjlfj\" (UniqueName: \"kubernetes.io/projected/d873a023-c5df-43cd-9008-fb0ac22d99a0-kube-api-access-tjlfj\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:52.158799 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.158702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-default-certificate\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.158799 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.158753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xz66\" (UniqueName: \"kubernetes.io/projected/60362f96-0ab5-45f5-b480-008e39e6ac85-kube-api-access-8xz66\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.159008 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.158847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-stats-auth\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.159008 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.158911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.159008 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.158928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.185668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.185640 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5"] Apr 21 16:03:52.187558 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.187543 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:52.191207 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.191182 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-96r25\"" Apr 21 16:03:52.191340 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.191186 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 16:03:52.191781 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.191767 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 16:03:52.193055 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.193037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:03:52.208408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.208386 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5"] Apr 21 16:03:52.259689 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.259666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-stats-auth\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.259821 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.259708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.259821 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.259726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.259821 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.259756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:52.259821 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.259783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9x4\" (UniqueName: \"kubernetes.io/projected/4c3d83a5-fc73-4741-b1fe-387863ed66de-kube-api-access-dv9x4\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:52.260029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.259833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-default-certificate\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.260029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.259891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xz66\" (UniqueName: \"kubernetes.io/projected/60362f96-0ab5-45f5-b480-008e39e6ac85-kube-api-access-8xz66\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.264447 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.264426 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 16:03:52.264690 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.264667 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 16:03:52.264822 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.264726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 16:03:52.264822 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.264778 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 16:03:52.270616 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.270600 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 16:03:52.270681 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.270654 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:52.770638326 +0000 UTC m=+121.146820118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : configmap references non-existent config key: service-ca.crt Apr 21 16:03:52.270681 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.270678 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:52.770667515 +0000 UTC m=+121.146849307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : secret "router-metrics-certs-default" not found Apr 21 16:03:52.272729 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.272701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-default-certificate\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.272821 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.272777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-stats-auth\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.284867 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.284832 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r"] Apr 21 16:03:52.286756 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.286742 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.293558 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.293539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 16:03:52.293647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.293578 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 16:03:52.293647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.293602 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:03:52.293647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.293638 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 16:03:52.293946 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.293930 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hbjzv\"" Apr 21 16:03:52.305440 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.305422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r"] Apr 21 16:03:52.310990 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.310974 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 16:03:52.322002 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.321985 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 16:03:52.331723 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.331704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xz66\" (UniqueName: \"kubernetes.io/projected/60362f96-0ab5-45f5-b480-008e39e6ac85-kube-api-access-8xz66\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.360525 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.360502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-serving-cert\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: \"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.360619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.360538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pllmf\" (UniqueName: \"kubernetes.io/projected/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-kube-api-access-pllmf\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: \"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.360619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.360591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-config\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: 
\"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.360697 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.360631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:52.360697 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.360647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv9x4\" (UniqueName: \"kubernetes.io/projected/4c3d83a5-fc73-4741-b1fe-387863ed66de-kube-api-access-dv9x4\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:52.360783 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.360766 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 16:03:52.360838 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.360829 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls podName:4c3d83a5-fc73-4741-b1fe-387863ed66de nodeName:}" failed. No retries permitted until 2026-04-21 16:03:52.860814675 +0000 UTC m=+121.236996468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5m7z5" (UID: "4c3d83a5-fc73-4741-b1fe-387863ed66de") : secret "samples-operator-tls" not found Apr 21 16:03:52.388128 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.388103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv9x4\" (UniqueName: \"kubernetes.io/projected/4c3d83a5-fc73-4741-b1fe-387863ed66de-kube-api-access-dv9x4\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:52.460955 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.460899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pllmf\" (UniqueName: \"kubernetes.io/projected/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-kube-api-access-pllmf\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: \"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.460955 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.460952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-config\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: \"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.461091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.461045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-serving-cert\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: \"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.461464 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.461442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-config\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: \"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.462972 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.462949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-serving-cert\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: \"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.470706 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.470684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pllmf\" (UniqueName: \"kubernetes.io/projected/3fb650da-0a2a-4d1a-b6d9-01cfaef03e45-kube-api-access-pllmf\") pod \"service-ca-operator-d6fc45fc5-w4j6r\" (UID: \"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.562021 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.561995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:52.562143 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.562127 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:52.562196 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.562187 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls podName:d873a023-c5df-43cd-9008-fb0ac22d99a0 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:53.562172212 +0000 UTC m=+121.938354001 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wtt24" (UID: "d873a023-c5df-43cd-9008-fb0ac22d99a0") : secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:52.595837 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.595815 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" Apr 21 16:03:52.723606 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.723537 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r"] Apr 21 16:03:52.726524 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:03:52.726495 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fb650da_0a2a_4d1a_b6d9_01cfaef03e45.slice/crio-5f1ad9d16f2c69ef553ae7dfdcdc3959052cf9be9af10e2ca71bf4eba5a0c242 WatchSource:0}: Error finding container 5f1ad9d16f2c69ef553ae7dfdcdc3959052cf9be9af10e2ca71bf4eba5a0c242: Status 404 returned error can't find the container with id 5f1ad9d16f2c69ef553ae7dfdcdc3959052cf9be9af10e2ca71bf4eba5a0c242 Apr 21 16:03:52.864442 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.864404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.864442 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.864449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:52.864692 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:52.864469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:52.864692 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.864561 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 16:03:52.864692 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.864597 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:53.864576714 +0000 UTC m=+122.240758505 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : configmap references non-existent config key: service-ca.crt Apr 21 16:03:52.864692 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.864623 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:53.864614994 +0000 UTC m=+122.240796786 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : secret "router-metrics-certs-default" not found Apr 21 16:03:52.864692 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.864604 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 16:03:52.864692 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:52.864682 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls podName:4c3d83a5-fc73-4741-b1fe-387863ed66de nodeName:}" failed. No retries permitted until 2026-04-21 16:03:53.864666672 +0000 UTC m=+122.240848466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5m7z5" (UID: "4c3d83a5-fc73-4741-b1fe-387863ed66de") : secret "samples-operator-tls" not found Apr 21 16:03:53.570152 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:53.570118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:53.570579 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:53.570176 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:53.570579 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:53.570241 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls podName:d873a023-c5df-43cd-9008-fb0ac22d99a0 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:55.570225496 +0000 UTC m=+123.946407296 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wtt24" (UID: "d873a023-c5df-43cd-9008-fb0ac22d99a0") : secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:53.675228 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:53.675189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" event={"ID":"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45","Type":"ContainerStarted","Data":"5f1ad9d16f2c69ef553ae7dfdcdc3959052cf9be9af10e2ca71bf4eba5a0c242"} Apr 21 16:03:53.872246 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:53.872160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:53.872246 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:53.872211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:53.872466 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:53.872315 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 16:03:53.872466 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:53.872340 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:55.872319054 +0000 UTC m=+124.248500844 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : configmap references non-existent config key: service-ca.crt Apr 21 16:03:53.872466 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:53.872374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:53.872602 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:53.872475 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 16:03:53.872602 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:53.872513 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:55.872489915 +0000 UTC m=+124.248671720 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : secret "router-metrics-certs-default" not found Apr 21 16:03:53.872602 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:53.872540 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls podName:4c3d83a5-fc73-4741-b1fe-387863ed66de nodeName:}" failed. No retries permitted until 2026-04-21 16:03:55.872529112 +0000 UTC m=+124.248710901 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5m7z5" (UID: "4c3d83a5-fc73-4741-b1fe-387863ed66de") : secret "samples-operator-tls" not found Apr 21 16:03:54.678578 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:54.678548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" event={"ID":"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45","Type":"ContainerStarted","Data":"0631cdef1fcebfde570ff2c7cf5c489878c592f2de858c16881534f81d37146a"} Apr 21 16:03:54.706377 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:54.706330 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" podStartSLOduration=0.887287246 podStartE2EDuration="2.706317252s" podCreationTimestamp="2026-04-21 16:03:52 +0000 UTC" firstStartedPulling="2026-04-21 16:03:52.728236269 +0000 UTC m=+121.104418057" lastFinishedPulling="2026-04-21 16:03:54.54726627 +0000 UTC m=+122.923448063" observedRunningTime="2026-04-21 16:03:54.701000863 +0000 UTC m=+123.077182673" watchObservedRunningTime="2026-04-21 16:03:54.706317252 +0000 UTC m=+123.082499062" Apr 21 16:03:55.586138 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:55.586102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:55.586310 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:55.586234 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:55.586310 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:55.586291 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls podName:d873a023-c5df-43cd-9008-fb0ac22d99a0 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:59.586274796 +0000 UTC m=+127.962456594 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wtt24" (UID: "d873a023-c5df-43cd-9008-fb0ac22d99a0") : secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:55.888494 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:55.888452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:55.888494 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:55.888487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:55.888893 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:55.888512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:55.888893 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:55.888597 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 16:03:55.888893 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:55.888606 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:59.888587365 +0000 UTC m=+128.264769160 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : configmap references non-existent config key: service-ca.crt Apr 21 16:03:55.888893 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:55.888642 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 16:03:55.888893 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:55.888659 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:59.888646401 +0000 UTC m=+128.264828190 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : secret "router-metrics-certs-default" not found Apr 21 16:03:55.888893 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:55.888688 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls podName:4c3d83a5-fc73-4741-b1fe-387863ed66de nodeName:}" failed. No retries permitted until 2026-04-21 16:03:59.888677074 +0000 UTC m=+128.264858866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5m7z5" (UID: "4c3d83a5-fc73-4741-b1fe-387863ed66de") : secret "samples-operator-tls" not found Apr 21 16:03:56.200577 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.200504 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-g95tn"] Apr 21 16:03:56.202610 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.202594 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:56.212448 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.212429 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 16:03:56.213587 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.213572 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-cdlhn\"" Apr 21 16:03:56.213811 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.213787 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 16:03:56.236729 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.236706 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-g95tn"] Apr 21 16:03:56.291990 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.291968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:56.292075 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.291999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:56.392434 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.392400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:56.392517 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.392446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:56.392685 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:56.392629 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 16:03:56.392787 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:56.392734 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert podName:ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:56.892712627 +0000 UTC m=+125.268894416 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-g95tn" (UID: "ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4") : secret "networking-console-plugin-cert" not found Apr 21 16:03:56.393119 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.393098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:56.896809 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:56.896765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:56.897215 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:56.896926 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 16:03:56.897215 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:56.897002 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert podName:ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:57.896985729 +0000 UTC m=+126.273167518 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-g95tn" (UID: "ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4") : secret "networking-console-plugin-cert" not found Apr 21 16:03:57.903080 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:57.903044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:57.903575 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:57.903190 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 16:03:57.903575 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:57.903264 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert podName:ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:59.90324667 +0000 UTC m=+128.279428463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-g95tn" (UID: "ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4") : secret "networking-console-plugin-cert" not found Apr 21 16:03:59.586162 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:59.586135 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qdd9v_3debd313-0085-4668-87ee-27bfcceec0f2/dns-node-resolver/0.log" Apr 21 16:03:59.616457 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:59.616424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:03:59.616588 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.616575 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:59.616657 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.616644 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls podName:d873a023-c5df-43cd-9008-fb0ac22d99a0 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:07.616629663 +0000 UTC m=+135.992811452 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wtt24" (UID: "d873a023-c5df-43cd-9008-fb0ac22d99a0") : secret "cluster-monitoring-operator-tls" not found Apr 21 16:03:59.918833 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:59.918800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:59.918833 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:59.918835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:03:59.919023 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:59.918896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:03:59.919023 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:03:59.918942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:03:59.919023 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.918972 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 16:03:59.919023 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.919018 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 16:03:59.919146 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.919054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:07.9190329 +0000 UTC m=+136.295214710 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : secret "router-metrics-certs-default" not found Apr 21 16:03:59.919146 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.919056 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 16:03:59.919146 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.919076 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:07.919063052 +0000 UTC m=+136.295244844 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : configmap references non-existent config key: service-ca.crt Apr 21 16:03:59.919146 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.919088 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls podName:4c3d83a5-fc73-4741-b1fe-387863ed66de nodeName:}" failed. No retries permitted until 2026-04-21 16:04:07.919082522 +0000 UTC m=+136.295264311 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5m7z5" (UID: "4c3d83a5-fc73-4741-b1fe-387863ed66de") : secret "samples-operator-tls" not found Apr 21 16:03:59.919146 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:03:59.919105 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert podName:ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:03.919091798 +0000 UTC m=+132.295273587 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-g95tn" (UID: "ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4") : secret "networking-console-plugin-cert" not found Apr 21 16:04:00.386463 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:00.386437 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r68ws_0983fe40-299f-4402-9953-2982f995ea8f/node-ca/0.log" Apr 21 16:04:00.926478 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:00.926431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:04:00.926841 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:00.926575 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 16:04:00.926841 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:00.926637 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs podName:b782d8e0-5d5e-4897-91b4-49f7049c7fac nodeName:}" failed. No retries permitted until 2026-04-21 16:06:02.92662168 +0000 UTC m=+251.302803473 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs") pod "network-metrics-daemon-clkll" (UID: "b782d8e0-5d5e-4897-91b4-49f7049c7fac") : secret "metrics-daemon-secret" not found Apr 21 16:04:03.948585 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:03.948548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:04:03.948953 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:03.948691 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 16:04:03.948953 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:03.948752 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert podName:ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:11.948737125 +0000 UTC m=+140.324918913 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-g95tn" (UID: "ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4") : secret "networking-console-plugin-cert" not found Apr 21 16:04:07.676986 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:07.676944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:04:07.677354 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:07.677092 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 16:04:07.677354 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:07.677156 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls podName:d873a023-c5df-43cd-9008-fb0ac22d99a0 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:23.677140001 +0000 UTC m=+152.053321789 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wtt24" (UID: "d873a023-c5df-43cd-9008-fb0ac22d99a0") : secret "cluster-monitoring-operator-tls" not found Apr 21 16:04:07.980372 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:07.980295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:07.980372 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:07.980328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:07.980372 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:07.980345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:04:07.980646 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:07.980436 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle podName:60362f96-0ab5-45f5-b480-008e39e6ac85 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:23.980416978 +0000 UTC m=+152.356598767 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle") pod "router-default-5d9c9b646-wbgwm" (UID: "60362f96-0ab5-45f5-b480-008e39e6ac85") : configmap references non-existent config key: service-ca.crt Apr 21 16:04:07.982622 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:07.982597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c3d83a5-fc73-4741-b1fe-387863ed66de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5m7z5\" (UID: \"4c3d83a5-fc73-4741-b1fe-387863ed66de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:04:07.982772 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:07.982750 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60362f96-0ab5-45f5-b480-008e39e6ac85-metrics-certs\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:08.098869 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:08.098833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" Apr 21 16:04:08.232904 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:08.232882 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5"] Apr 21 16:04:08.708332 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:08.708295 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" event={"ID":"4c3d83a5-fc73-4741-b1fe-387863ed66de","Type":"ContainerStarted","Data":"420a281ce81c9fd7a6a9d5c41d166feade531064ad1cc7bc547a5628551f03d9"} Apr 21 16:04:10.713685 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:10.713640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" event={"ID":"4c3d83a5-fc73-4741-b1fe-387863ed66de","Type":"ContainerStarted","Data":"f7f1cb4a4b155171ba525bcb4d605c641901ddf8be5587ccdbd1c7bd87fc3039"} Apr 21 16:04:10.713685 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:10.713682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" event={"ID":"4c3d83a5-fc73-4741-b1fe-387863ed66de","Type":"ContainerStarted","Data":"cf7013fc666978f52257c8c96f017bbcc968a282d2d91bec41ca48526212eaa3"} Apr 21 16:04:10.747278 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:10.747230 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5m7z5" podStartSLOduration=17.1931019 podStartE2EDuration="18.747217263s" podCreationTimestamp="2026-04-21 16:03:52 +0000 UTC" firstStartedPulling="2026-04-21 16:04:08.276264821 +0000 UTC m=+136.652446613" lastFinishedPulling="2026-04-21 16:04:09.830380184 +0000 UTC m=+138.206561976" observedRunningTime="2026-04-21 16:04:10.746960423 +0000 UTC m=+139.123142233" watchObservedRunningTime="2026-04-21 16:04:10.747217263 +0000 UTC m=+139.123399073" Apr 21 16:04:12.009609 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:12.009573 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:04:12.012537 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:12.012482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-g95tn\" (UID: \"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:04:12.111258 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:12.111235 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" Apr 21 16:04:12.228330 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:12.228304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-g95tn"] Apr 21 16:04:12.231007 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:12.230982 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca93eccd_f4c5_4bcf_a5f3_7521b23a86d4.slice/crio-fb501cf6cb53a8412259174a5fa24d62cb76f4bbade96d22933e509182e59f66 WatchSource:0}: Error finding container fb501cf6cb53a8412259174a5fa24d62cb76f4bbade96d22933e509182e59f66: Status 404 returned error can't find the container with id fb501cf6cb53a8412259174a5fa24d62cb76f4bbade96d22933e509182e59f66 Apr 21 16:04:12.718495 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:12.718459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" event={"ID":"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4","Type":"ContainerStarted","Data":"fb501cf6cb53a8412259174a5fa24d62cb76f4bbade96d22933e509182e59f66"} Apr 21 16:04:13.723764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:13.723688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" event={"ID":"ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4","Type":"ContainerStarted","Data":"6a7a7c1800f3186a6fe5e2eb747ae546695296a882cc12e3fe8edcf10c42a13d"} Apr 21 16:04:13.749476 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:13.749432 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-g95tn" podStartSLOduration=16.589607667 podStartE2EDuration="17.749418668s" podCreationTimestamp="2026-04-21 16:03:56 +0000 UTC" firstStartedPulling="2026-04-21 16:04:12.23349852 +0000 UTC m=+140.609680314" lastFinishedPulling="2026-04-21 16:04:13.393309526 +0000 UTC m=+141.769491315" observedRunningTime="2026-04-21 16:04:13.749183233 +0000 UTC m=+142.125365043" watchObservedRunningTime="2026-04-21 16:04:13.749418668 +0000 UTC m=+142.125600525" Apr 21 16:04:23.685283 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:23.685237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:04:23.687786 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:23.687755 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d873a023-c5df-43cd-9008-fb0ac22d99a0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wtt24\" (UID: \"d873a023-c5df-43cd-9008-fb0ac22d99a0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:04:23.723390 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:23.723361 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-f65lv\"" Apr 21 16:04:23.729543 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:23.729520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" Apr 21 16:04:23.853054 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:23.853025 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24"] Apr 21 16:04:23.853176 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:23.853126 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd873a023_c5df_43cd_9008_fb0ac22d99a0.slice/crio-fdb59784d43b88db6c6a97d25d5b922b38e1a4b1a645804ff99bbe14a41de8af WatchSource:0}: Error finding container fdb59784d43b88db6c6a97d25d5b922b38e1a4b1a645804ff99bbe14a41de8af: Status 404 returned error can't find the container with id fdb59784d43b88db6c6a97d25d5b922b38e1a4b1a645804ff99bbe14a41de8af Apr 21 16:04:23.987216 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:23.987140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:23.987688 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:23.987668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60362f96-0ab5-45f5-b480-008e39e6ac85-service-ca-bundle\") pod \"router-default-5d9c9b646-wbgwm\" (UID: \"60362f96-0ab5-45f5-b480-008e39e6ac85\") " pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:24.156716 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:24.156693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-fwtvp\"" Apr 21 16:04:24.161901 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:24.161874 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:24.288684 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:24.288628 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5d9c9b646-wbgwm"] Apr 21 16:04:24.291653 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:24.291632 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60362f96_0ab5_45f5_b480_008e39e6ac85.slice/crio-b7e0648c747a78079951e8a0c2c583fd0be31dcc168aa7184e73cab9c93c1666 WatchSource:0}: Error finding container b7e0648c747a78079951e8a0c2c583fd0be31dcc168aa7184e73cab9c93c1666: Status 404 returned error can't find the container with id b7e0648c747a78079951e8a0c2c583fd0be31dcc168aa7184e73cab9c93c1666 Apr 21 16:04:24.754668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:24.754613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" event={"ID":"60362f96-0ab5-45f5-b480-008e39e6ac85","Type":"ContainerStarted","Data":"cfed8279b39b42683faa8c1eb4ebe2425036a7c5b28055d9c66bb05abf523a66"} Apr 21 16:04:24.754668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:24.754656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" event={"ID":"60362f96-0ab5-45f5-b480-008e39e6ac85","Type":"ContainerStarted","Data":"b7e0648c747a78079951e8a0c2c583fd0be31dcc168aa7184e73cab9c93c1666"} Apr 21 16:04:24.755969 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:24.755940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" event={"ID":"d873a023-c5df-43cd-9008-fb0ac22d99a0","Type":"ContainerStarted","Data":"fdb59784d43b88db6c6a97d25d5b922b38e1a4b1a645804ff99bbe14a41de8af"} Apr 21 16:04:24.778063 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:24.778013 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" podStartSLOduration=33.777995718 podStartE2EDuration="33.777995718s" podCreationTimestamp="2026-04-21 16:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:04:24.777545493 +0000 UTC m=+153.153727305" watchObservedRunningTime="2026-04-21 16:04:24.777995718 +0000 UTC m=+153.154177529" Apr 21 16:04:25.162879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:25.162817 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:25.166236 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:25.166205 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:25.760271 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:25.760235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" event={"ID":"d873a023-c5df-43cd-9008-fb0ac22d99a0","Type":"ContainerStarted","Data":"da416b3b3ca73f45a76404cdb2873b7af893e0cbcb5eee6b639f4b935330436b"} Apr 21 16:04:25.760724 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:25.760508 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:25.761784 ip-10-0-138-191 kubenswrapper[2576]: I0421 
16:04:25.761760 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5d9c9b646-wbgwm" Apr 21 16:04:25.786284 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:25.786222 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wtt24" podStartSLOduration=33.062332854 podStartE2EDuration="34.786202866s" podCreationTimestamp="2026-04-21 16:03:51 +0000 UTC" firstStartedPulling="2026-04-21 16:04:23.855081693 +0000 UTC m=+152.231263492" lastFinishedPulling="2026-04-21 16:04:25.578951703 +0000 UTC m=+153.955133504" observedRunningTime="2026-04-21 16:04:25.781185491 +0000 UTC m=+154.157367304" watchObservedRunningTime="2026-04-21 16:04:25.786202866 +0000 UTC m=+154.162384678" Apr 21 16:04:26.829088 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.829059 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-84699598f-cjccv"] Apr 21 16:04:26.831093 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.831076 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:26.837691 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.837672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 16:04:26.838207 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.838192 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vqm4s\"" Apr 21 16:04:26.840007 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.839993 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 16:04:26.840126 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.840106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 16:04:26.844674 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.844656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 16:04:26.849489 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.849465 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84699598f-cjccv"] Apr 21 16:04:26.910564 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.910533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-trusted-ca\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:26.910702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.910580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-image-registry-private-configuration\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:26.910702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.910602 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkrwl\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-kube-api-access-wkrwl\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:26.910702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.910627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-bound-sa-token\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:26.910702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.910696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-ca-trust-extracted\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:26.911024 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.911002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-tls\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:26.911145 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.911035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-certificates\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:26.911145 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:26.911072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-installation-pull-secrets\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.011851 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.011824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-bound-sa-token\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.011985 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.011885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-ca-trust-extracted\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.011985 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:04:27.011927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-tls\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.011985 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.011943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-certificates\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.011985 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.011964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-installation-pull-secrets\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.012198 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.012101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-trusted-ca\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.012198 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.012138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-image-registry-private-configuration\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.012198 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.012161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkrwl\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-kube-api-access-wkrwl\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.012432 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.012396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-ca-trust-extracted\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.012994 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.012968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-certificates\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.013227 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.013204 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-trusted-ca\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.014510 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.014487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-image-registry-private-configuration\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.014598 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.014494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-installation-pull-secrets\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.015342 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.015322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-tls\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.065100 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.065077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkrwl\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-kube-api-access-wkrwl\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.072902 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.072880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-bound-sa-token\") pod \"image-registry-84699598f-cjccv\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.094184 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.094121 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84699598f-cjccv"] Apr 21 16:04:27.094346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.094332 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.114264 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.114239 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rtv56"] Apr 21 16:04:27.118023 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.118003 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.128584 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.128564 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 16:04:27.128684 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.128643 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 16:04:27.128753 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.128706 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 16:04:27.129994 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.129975 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 16:04:27.130594 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.130176 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rdjm8\"" Apr 21 16:04:27.138920 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.138891 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rtv56"] Apr 21 16:04:27.213892 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.213849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5v7p\" (UniqueName: \"kubernetes.io/projected/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-kube-api-access-w5v7p\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.214009 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.213974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.214009 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.214002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.214101 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.214044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-data-volume\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.214151 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.214132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-crio-socket\") pod \"insights-runtime-extractor-rtv56\" (UID: 
\"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.243627 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:27.243602 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94cbed55_afd1_4b53_b7c5_b3cc3490ade8.slice/crio-bf9e4201d7b6fb9f6ea21ea775b54b7faaabc19d4ae64582cc49fe8ddad5bd99 WatchSource:0}: Error finding container bf9e4201d7b6fb9f6ea21ea775b54b7faaabc19d4ae64582cc49fe8ddad5bd99: Status 404 returned error can't find the container with id bf9e4201d7b6fb9f6ea21ea775b54b7faaabc19d4ae64582cc49fe8ddad5bd99 Apr 21 16:04:27.251657 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.251622 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84699598f-cjccv"] Apr 21 16:04:27.315071 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.315050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-data-volume\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.315188 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.315096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-crio-socket\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.315188 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.315131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5v7p\" (UniqueName: \"kubernetes.io/projected/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-kube-api-access-w5v7p\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.315293 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.315233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-crio-socket\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.315293 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.315272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.315394 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.315293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.315394 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.315385 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-data-volume\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.316610 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.316589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.317486 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.317459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.333176 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.333153 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5v7p\" (UniqueName: \"kubernetes.io/projected/7da6b8da-7bb2-4107-8411-5fe4591aa3d3-kube-api-access-w5v7p\") pod \"insights-runtime-extractor-rtv56\" (UID: \"7da6b8da-7bb2-4107-8411-5fe4591aa3d3\") " pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.426834 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.426809 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rtv56" Apr 21 16:04:27.579541 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.579515 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rtv56"] Apr 21 16:04:27.581797 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:27.581768 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da6b8da_7bb2_4107_8411_5fe4591aa3d3.slice/crio-eae50cc42ead9b94ba24b9abd71899e72b83c50ed54b658638e8d580dbc6ec7e WatchSource:0}: Error finding container eae50cc42ead9b94ba24b9abd71899e72b83c50ed54b658638e8d580dbc6ec7e: Status 404 returned error can't find the container with id eae50cc42ead9b94ba24b9abd71899e72b83c50ed54b658638e8d580dbc6ec7e Apr 21 16:04:27.581980 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:27.581955 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ffpt6" podUID="b713efcd-8ac3-49bc-a307-d7919be3e7ba" Apr 21 16:04:27.613491 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:27.613465 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rsggj" podUID="e01c7fac-d92f-4341-ad18-e2c502d62462" Apr 21 16:04:27.766475 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.766386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84699598f-cjccv" 
event={"ID":"94cbed55-afd1-4b53-b7c5-b3cc3490ade8","Type":"ContainerStarted","Data":"456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1"} Apr 21 16:04:27.766475 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.766427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84699598f-cjccv" event={"ID":"94cbed55-afd1-4b53-b7c5-b3cc3490ade8","Type":"ContainerStarted","Data":"bf9e4201d7b6fb9f6ea21ea775b54b7faaabc19d4ae64582cc49fe8ddad5bd99"} Apr 21 16:04:27.766701 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.766543 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:27.767702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.767664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rtv56" event={"ID":"7da6b8da-7bb2-4107-8411-5fe4591aa3d3","Type":"ContainerStarted","Data":"c328e583aec6659c3eb81dee9b2270360d6cbde4e122d5a2184859b55c0944df"} Apr 21 16:04:27.767702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.767697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rtv56" event={"ID":"7da6b8da-7bb2-4107-8411-5fe4591aa3d3","Type":"ContainerStarted","Data":"eae50cc42ead9b94ba24b9abd71899e72b83c50ed54b658638e8d580dbc6ec7e"} Apr 21 16:04:27.767894 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.767704 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ffpt6" Apr 21 16:04:27.789841 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:27.789801 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-84699598f-cjccv" podStartSLOduration=1.789789209 podStartE2EDuration="1.789789209s" podCreationTimestamp="2026-04-21 16:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:04:27.788236743 +0000 UTC m=+156.164418578" watchObservedRunningTime="2026-04-21 16:04:27.789789209 +0000 UTC m=+156.165970998" Apr 21 16:04:28.089370 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.089308 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c7f6dc48-vpmnt"] Apr 21 16:04:28.091680 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.091663 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.096496 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.096473 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 16:04:28.096574 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.096479 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 16:04:28.096574 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.096542 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 16:04:28.096875 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.096839 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 16:04:28.097270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.097246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 16:04:28.097839 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.097821 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 16:04:28.098759 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.098739 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 16:04:28.100966 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.100952 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9tkts\"" Apr 21 16:04:28.105248 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.105230 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c7f6dc48-vpmnt"] Apr 21 16:04:28.223201 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.223171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-serving-cert\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.223201 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.223204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjqxg\" (UniqueName: \"kubernetes.io/projected/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-kube-api-access-vjqxg\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.223406 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.223316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-config\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.223406 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.223354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-service-ca\") pod \"console-7c7f6dc48-vpmnt\" (UID: 
\"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.223406 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.223380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-oauth-config\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.223406 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.223399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-oauth-serving-cert\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.324272 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.324250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-service-ca\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.324355 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.324290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-oauth-config\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.324355 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.324319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-oauth-serving-cert\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.324502 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.324480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-serving-cert\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.324551 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.324525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjqxg\" (UniqueName: \"kubernetes.io/projected/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-kube-api-access-vjqxg\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.324619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.324603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-config\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.325128 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.324994 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-oauth-serving-cert\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.325128 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.324994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-service-ca\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.325302 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.325281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-config\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.326567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.326541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-oauth-config\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.326787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.326772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-serving-cert\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.344597 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.344550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjqxg\" (UniqueName: \"kubernetes.io/projected/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-kube-api-access-vjqxg\") pod \"console-7c7f6dc48-vpmnt\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.400734 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.400713 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:28.525164 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.525134 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c7f6dc48-vpmnt"] Apr 21 16:04:28.528613 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:28.528592 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb95460d_65fc_422a_bfe4_e7a0d9b427bf.slice/crio-31a9de4d8927344facb2e40abd985d7d2fc4b8239f8d45b5296ae2165dd98355 WatchSource:0}: Error finding container 31a9de4d8927344facb2e40abd985d7d2fc4b8239f8d45b5296ae2165dd98355: Status 404 returned error can't find the container with id 31a9de4d8927344facb2e40abd985d7d2fc4b8239f8d45b5296ae2165dd98355 Apr 21 16:04:28.770624 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.770593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7f6dc48-vpmnt" event={"ID":"eb95460d-65fc-422a-bfe4-e7a0d9b427bf","Type":"ContainerStarted","Data":"31a9de4d8927344facb2e40abd985d7d2fc4b8239f8d45b5296ae2165dd98355"} Apr 21 16:04:28.772038 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:28.772016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rtv56" event={"ID":"7da6b8da-7bb2-4107-8411-5fe4591aa3d3","Type":"ContainerStarted","Data":"91198e0a2b3f09bdab3779c2b513f6c115cb48ae986600555e2748d163b0bab0"} Apr 21 16:04:29.259066 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:29.259020 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-clkll" podUID="b782d8e0-5d5e-4897-91b4-49f7049c7fac" Apr 21 16:04:30.347240 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.347157 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-b9rvv"] Apr 21 16:04:30.349346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.349320 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.352336 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.352312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 16:04:30.352760 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.352742 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5bjrk\"" Apr 21 16:04:30.353752 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.353734 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 16:04:30.353879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.353761 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 16:04:30.378300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.378276 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-b9rvv"] Apr 21 16:04:30.441915 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.441887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vpk\" (UniqueName: \"kubernetes.io/projected/9d58a911-4968-4350-90b2-66ae9a65496d-kube-api-access-m5vpk\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.442040 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.441920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d58a911-4968-4350-90b2-66ae9a65496d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.442040 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.442006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d58a911-4968-4350-90b2-66ae9a65496d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.442135 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.442081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9d58a911-4968-4350-90b2-66ae9a65496d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.542547 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.542522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d58a911-4968-4350-90b2-66ae9a65496d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.542687 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.542584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9d58a911-4968-4350-90b2-66ae9a65496d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.542687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.542646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vpk\" (UniqueName: \"kubernetes.io/projected/9d58a911-4968-4350-90b2-66ae9a65496d-kube-api-access-m5vpk\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.542687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.542678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d58a911-4968-4350-90b2-66ae9a65496d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.543345 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.543300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d58a911-4968-4350-90b2-66ae9a65496d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.545685 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.545660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9d58a911-4968-4350-90b2-66ae9a65496d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.545793 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.545700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d58a911-4968-4350-90b2-66ae9a65496d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.554341 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.554316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vpk\" (UniqueName: \"kubernetes.io/projected/9d58a911-4968-4350-90b2-66ae9a65496d-kube-api-access-m5vpk\") pod \"prometheus-operator-5676c8c784-b9rvv\" (UID: \"9d58a911-4968-4350-90b2-66ae9a65496d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.660035 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.660011 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" Apr 21 16:04:30.779277 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.779233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rtv56" event={"ID":"7da6b8da-7bb2-4107-8411-5fe4591aa3d3","Type":"ContainerStarted","Data":"32913dc3512d5e6f527ce924bbf9a479bb25738730bb9f86ed8ca26fb3f41bc9"} Apr 21 16:04:30.806305 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:30.806261 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rtv56" podStartSLOduration=1.417030805 podStartE2EDuration="3.806243779s" podCreationTimestamp="2026-04-21 16:04:27 +0000 UTC" firstStartedPulling="2026-04-21 16:04:27.637040943 +0000 UTC m=+156.013222733" lastFinishedPulling="2026-04-21 16:04:30.026253914 +0000 UTC m=+158.402435707" observedRunningTime="2026-04-21 16:04:30.805569944 +0000 UTC m=+159.181751757" watchObservedRunningTime="2026-04-21 16:04:30.806243779 +0000 UTC m=+159.182425592" Apr 21 16:04:31.495038 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:31.495009 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-b9rvv"] Apr 21 16:04:31.499099 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:31.499074 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d58a911_4968_4350_90b2_66ae9a65496d.slice/crio-63eea8e94b2ea4b6f17288d1f5192b348af885ad6b6df567c3c2deabbc57a34f WatchSource:0}: Error finding container 63eea8e94b2ea4b6f17288d1f5192b348af885ad6b6df567c3c2deabbc57a34f: Status 404 returned error can't find the container with id 63eea8e94b2ea4b6f17288d1f5192b348af885ad6b6df567c3c2deabbc57a34f Apr 21 16:04:31.783039 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:31.782998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" event={"ID":"9d58a911-4968-4350-90b2-66ae9a65496d","Type":"ContainerStarted","Data":"63eea8e94b2ea4b6f17288d1f5192b348af885ad6b6df567c3c2deabbc57a34f"} Apr 21 16:04:31.784337 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:31.784311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7f6dc48-vpmnt" event={"ID":"eb95460d-65fc-422a-bfe4-e7a0d9b427bf","Type":"ContainerStarted","Data":"1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689"} Apr 21 16:04:31.810122 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:31.810084 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c7f6dc48-vpmnt" podStartSLOduration=0.941229179 podStartE2EDuration="3.810074527s" podCreationTimestamp="2026-04-21 16:04:28 +0000 UTC" firstStartedPulling="2026-04-21 16:04:28.530311123 +0000 UTC m=+156.906492916" lastFinishedPulling="2026-04-21 16:04:31.399156471 +0000 UTC m=+159.775338264" observedRunningTime="2026-04-21 16:04:31.809013073 +0000 UTC m=+160.185194896" watchObservedRunningTime="2026-04-21 16:04:31.810074527 +0000 UTC m=+160.186256339" Apr 21 16:04:32.459718 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:32.459688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " 
pod="openshift-dns/dns-default-ffpt6" Apr 21 16:04:32.462164 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:32.462137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b713efcd-8ac3-49bc-a307-d7919be3e7ba-metrics-tls\") pod \"dns-default-ffpt6\" (UID: \"b713efcd-8ac3-49bc-a307-d7919be3e7ba\") " pod="openshift-dns/dns-default-ffpt6" Apr 21 16:04:32.560826 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:32.560796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:04:32.563446 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:32.563423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01c7fac-d92f-4341-ad18-e2c502d62462-cert\") pod \"ingress-canary-rsggj\" (UID: \"e01c7fac-d92f-4341-ad18-e2c502d62462\") " pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:04:32.570879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:32.570841 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2l66p\"" Apr 21 16:04:32.579079 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:32.579061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ffpt6" Apr 21 16:04:32.809965 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:32.809916 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ffpt6"] Apr 21 16:04:32.817913 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:32.817885 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb713efcd_8ac3_49bc_a307_d7919be3e7ba.slice/crio-c1b12c25d43676d4791afb95aef1542028c4c2376800df638c6ce96f74fd55f1 WatchSource:0}: Error finding container c1b12c25d43676d4791afb95aef1542028c4c2376800df638c6ce96f74fd55f1: Status 404 returned error can't find the container with id c1b12c25d43676d4791afb95aef1542028c4c2376800df638c6ce96f74fd55f1 Apr 21 16:04:33.792633 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:33.792585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" event={"ID":"9d58a911-4968-4350-90b2-66ae9a65496d","Type":"ContainerStarted","Data":"1f8da44a5d1daceeebcfb70cdda16dffb059c8306320b5c37b414eb0c69220c6"} Apr 21 16:04:33.793152 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:33.792641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" event={"ID":"9d58a911-4968-4350-90b2-66ae9a65496d","Type":"ContainerStarted","Data":"5075b1f1b87e0e092136c35da92084c1661145b610fd1c8cb59817a5d97e26e3"} Apr 21 16:04:33.793765 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:33.793738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ffpt6" event={"ID":"b713efcd-8ac3-49bc-a307-d7919be3e7ba","Type":"ContainerStarted","Data":"c1b12c25d43676d4791afb95aef1542028c4c2376800df638c6ce96f74fd55f1"} Apr 21 16:04:33.820264 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:33.820200 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-b9rvv" podStartSLOduration=2.589034228 podStartE2EDuration="3.820178286s" podCreationTimestamp="2026-04-21 16:04:30 +0000 UTC" firstStartedPulling="2026-04-21 16:04:31.50106266 +0000 UTC m=+159.877244450" lastFinishedPulling="2026-04-21 16:04:32.732206709 +0000 UTC m=+161.108388508" observedRunningTime="2026-04-21 16:04:33.818984159 +0000 UTC m=+162.195165971" watchObservedRunningTime="2026-04-21 16:04:33.820178286 +0000 UTC m=+162.196360098" Apr 21 16:04:34.798065 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:34.798028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ffpt6" event={"ID":"b713efcd-8ac3-49bc-a307-d7919be3e7ba","Type":"ContainerStarted","Data":"253a72fd9b7c9834ce5aa991f5831530714dd9def4c1943e95495cf29b880c70"} Apr 21 16:04:34.798065 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:34.798069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ffpt6" event={"ID":"b713efcd-8ac3-49bc-a307-d7919be3e7ba","Type":"ContainerStarted","Data":"8b4a54ae95df6786204b2f4da50964f42e96168360af2fdfcfa9816192d62f38"} Apr 21 16:04:35.806488 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.806459 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ffpt6" Apr 21 16:04:35.937136 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.937088 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ffpt6" podStartSLOduration=130.725994017 podStartE2EDuration="2m11.937073442s" podCreationTimestamp="2026-04-21 16:02:24 +0000 UTC" firstStartedPulling="2026-04-21 16:04:32.819524187 +0000 UTC m=+161.195705976" lastFinishedPulling="2026-04-21 16:04:34.030603611 +0000 UTC m=+162.406785401" observedRunningTime="2026-04-21 16:04:34.824010015 +0000 UTC m=+163.200191826" watchObservedRunningTime="2026-04-21 16:04:35.937073442 +0000 UTC m=+164.313255286" Apr 21 16:04:35.937988 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.937955 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4"] Apr 21 16:04:35.940305 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.940291 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jqg7t"] Apr 21 16:04:35.940448 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.940432 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:35.942286 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.942266 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:35.962664 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.962641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 16:04:35.962767 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.962754 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 16:04:35.963305 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.963291 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 16:04:35.963503 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.963489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-6sdfq\"" Apr 21 16:04:35.965998 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.965984 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-kvj79\"" Apr 21 16:04:35.966102 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.966088 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 16:04:35.971603 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.971580 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 16:04:35.973951 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:35.973923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4"] Apr 21 16:04:36.014285 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.014262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jqg7t"] Apr 21 16:04:36.091575 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.091575 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/696488ce-f740-4074-a939-4496126afadd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.091575 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxck\" (UniqueName: \"kubernetes.io/projected/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-api-access-fhxck\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.091730 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b732a56-615f-485f-8b03-9e7618e45b3a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.091730 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.091730 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.091730 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.091730 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091728 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3b732a56-615f-485f-8b03-9e7618e45b3a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.091916 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2k7v\" (UniqueName: \"kubernetes.io/projected/696488ce-f740-4074-a939-4496126afadd-kube-api-access-n2k7v\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.091916 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.091784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.192128 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n2k7v\" (UniqueName: \"kubernetes.io/projected/696488ce-f740-4074-a939-4496126afadd-kube-api-access-n2k7v\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.192231 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.192231 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.192231 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/696488ce-f740-4074-a939-4496126afadd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.192403 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxck\" (UniqueName: \"kubernetes.io/projected/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-api-access-fhxck\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.192403 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b732a56-615f-485f-8b03-9e7618e45b3a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.192511 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.192511 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.192511 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.192668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.192524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3b732a56-615f-485f-8b03-9e7618e45b3a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.192668 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:36.192628 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 16:04:36.192822 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:36.192686 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-tls podName:3b732a56-615f-485f-8b03-9e7618e45b3a nodeName:}" failed. No retries permitted until 2026-04-21 16:04:36.692669202 +0000 UTC m=+165.068850992 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-jqg7t" (UID: "3b732a56-615f-485f-8b03-9e7618e45b3a") : secret "kube-state-metrics-tls" not found Apr 21 16:04:36.192822 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:36.192686 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 16:04:36.192822 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:36.192809 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls podName:696488ce-f740-4074-a939-4496126afadd nodeName:}" failed. No retries permitted until 2026-04-21 16:04:36.692742767 +0000 UTC m=+165.068924612 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-vc9j4" (UID: "696488ce-f740-4074-a939-4496126afadd") : secret "openshift-state-metrics-tls" not found Apr 21 16:04:36.193173 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.193146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/696488ce-f740-4074-a939-4496126afadd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.193270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.193167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3b732a56-615f-485f-8b03-9e7618e45b3a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.193270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.193168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.193270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.193247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b732a56-615f-485f-8b03-9e7618e45b3a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.194494 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.194475 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qzhmc"] Apr 21 16:04:36.195014 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.194988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.195123 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.195104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.197463 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.197445 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.227485 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.227466 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mt6fm\"" Apr 21 16:04:36.229510 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.229492 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 16:04:36.229697 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.229683 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 16:04:36.230720 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.230705 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 16:04:36.248203 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.248186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxck\" (UniqueName: \"kubernetes.io/projected/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-api-access-fhxck\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.261427 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.261408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2k7v\" (UniqueName: \"kubernetes.io/projected/696488ce-f740-4074-a939-4496126afadd-kube-api-access-n2k7v\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.394453 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-tls\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.394540 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.394540 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdgc6\" (UniqueName: \"kubernetes.io/projected/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-kube-api-access-jdgc6\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.394658 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-sys\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " 
pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.394658 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-wtmp\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.394658 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.394766 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-metrics-client-ca\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.394766 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-root\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.394847 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.394783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-textfile\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495402 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-tls\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495493 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495493 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdgc6\" (UniqueName: \"kubernetes.io/projected/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-kube-api-access-jdgc6\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495578 ip-10-0-138-191 kubenswrapper[2576]: I0421 
16:04:36.495479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-sys\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495578 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:36.495519 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 16:04:36.495578 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-wtmp\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495578 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495578 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:36.495573 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-tls podName:4f95fc4b-4e03-4d17-9691-dd89e4ec7182 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:36.995555753 +0000 UTC m=+165.371737553 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-tls") pod "node-exporter-qzhmc" (UID: "4f95fc4b-4e03-4d17-9691-dd89e4ec7182") : secret "node-exporter-tls" not found Apr 21 16:04:36.495826 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-metrics-client-ca\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495826 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-root\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495826 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-sys\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495826 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-wtmp\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495826 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-textfile\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.495826 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.495791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-root\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.496071 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.496054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.496235 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.496217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-metrics-client-ca\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.496629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.496614 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-textfile\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.497413 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.497398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.563572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.563549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdgc6\" (UniqueName: \"kubernetes.io/projected/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-kube-api-access-jdgc6\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:36.698105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.698031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:36.698310 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.698285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.698440 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:36.698223 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 16:04:36.698440 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:36.698404 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls podName:696488ce-f740-4074-a939-4496126afadd nodeName:}" failed. No retries permitted until 2026-04-21 16:04:37.698381832 +0000 UTC m=+166.074563635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-vc9j4" (UID: "696488ce-f740-4074-a939-4496126afadd") : secret "openshift-state-metrics-tls" not found Apr 21 16:04:36.700712 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.700679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b732a56-615f-485f-8b03-9e7618e45b3a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jqg7t\" (UID: \"3b732a56-615f-485f-8b03-9e7618e45b3a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.854507 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.854469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" Apr 21 16:04:36.985850 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:36.985781 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jqg7t"] Apr 21 16:04:36.989524 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:36.989487 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b732a56_615f_485f_8b03_9e7618e45b3a.slice/crio-9a618f5ddcd13c8a672e770f3476e81cfd9762ed82961043768eb33189644021 WatchSource:0}: Error finding container 9a618f5ddcd13c8a672e770f3476e81cfd9762ed82961043768eb33189644021: Status 404 returned error can't find the container with id 9a618f5ddcd13c8a672e770f3476e81cfd9762ed82961043768eb33189644021 Apr 21 16:04:37.001249 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.001222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-tls\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:37.003336 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.003316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4f95fc4b-4e03-4d17-9691-dd89e4ec7182-node-exporter-tls\") pod \"node-exporter-qzhmc\" (UID: \"4f95fc4b-4e03-4d17-9691-dd89e4ec7182\") " pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:37.111355 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.111333 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qzhmc" Apr 21 16:04:37.119471 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:37.119444 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f95fc4b_4e03_4d17_9691_dd89e4ec7182.slice/crio-c5bc8dc2c58374ce73c9663732fa0ae1b1eebba6fc32c1f1b11d26c8d434ef0a WatchSource:0}: Error finding container c5bc8dc2c58374ce73c9663732fa0ae1b1eebba6fc32c1f1b11d26c8d434ef0a: Status 404 returned error can't find the container with id c5bc8dc2c58374ce73c9663732fa0ae1b1eebba6fc32c1f1b11d26c8d434ef0a Apr 21 16:04:37.707054 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.707018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:37.711309 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.711263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/696488ce-f740-4074-a939-4496126afadd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vc9j4\" (UID: \"696488ce-f740-4074-a939-4496126afadd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:37.750543 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.750360 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" Apr 21 16:04:37.819589 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.819535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzhmc" event={"ID":"4f95fc4b-4e03-4d17-9691-dd89e4ec7182","Type":"ContainerStarted","Data":"c5bc8dc2c58374ce73c9663732fa0ae1b1eebba6fc32c1f1b11d26c8d434ef0a"} Apr 21 16:04:37.821718 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.821670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" event={"ID":"3b732a56-615f-485f-8b03-9e7618e45b3a","Type":"ContainerStarted","Data":"9a618f5ddcd13c8a672e770f3476e81cfd9762ed82961043768eb33189644021"} Apr 21 16:04:37.927550 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:37.927478 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4"] Apr 21 16:04:38.072137 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:38.072066 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696488ce_f740_4074_a939_4496126afadd.slice/crio-5e5637585a235030af0229235c1c8acf930856f86da71a556299f23fbe719b9a WatchSource:0}: Error finding container 5e5637585a235030af0229235c1c8acf930856f86da71a556299f23fbe719b9a: Status 404 returned error can't find the container with id 5e5637585a235030af0229235c1c8acf930856f86da71a556299f23fbe719b9a Apr 21 16:04:38.247769 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.247729 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:04:38.250633 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.250610 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76wnm\"" Apr 21 16:04:38.259159 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.259139 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rsggj" Apr 21 16:04:38.401377 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.401344 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:38.401377 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.401383 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:04:38.402736 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.402711 2576 patch_prober.go:28] interesting pod/console-7c7f6dc48-vpmnt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.14:8443/health\": dial tcp 10.134.0.14:8443: connect: connection refused" start-of-body= Apr 21 16:04:38.402821 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.402759 2576 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7c7f6dc48-vpmnt" podUID="eb95460d-65fc-422a-bfe4-e7a0d9b427bf" containerName="console" probeResult="failure" output="Get \"https://10.134.0.14:8443/health\": dial tcp 10.134.0.14:8443: connect: connection refused" Apr 21 16:04:38.569750 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.569726 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rsggj"] Apr 21 16:04:38.577777 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:38.576515 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode01c7fac_d92f_4341_ad18_e2c502d62462.slice/crio-ee291651116197f02d4c63f8e62f9e13f613f21e6b23ef32ad5717b8d425a7e8 WatchSource:0}: Error finding container ee291651116197f02d4c63f8e62f9e13f613f21e6b23ef32ad5717b8d425a7e8: Status 404 returned error can't find the container with id ee291651116197f02d4c63f8e62f9e13f613f21e6b23ef32ad5717b8d425a7e8 Apr 21 16:04:38.826475 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.826441 2576 generic.go:358] "Generic (PLEG): container finished" podID="4f95fc4b-4e03-4d17-9691-dd89e4ec7182" containerID="7171e3db024c23ddadafb92ad49674c541a94a20bac371138918df06fbcf3242" exitCode=0 Apr 21 16:04:38.826685 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.826541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzhmc" event={"ID":"4f95fc4b-4e03-4d17-9691-dd89e4ec7182","Type":"ContainerDied","Data":"7171e3db024c23ddadafb92ad49674c541a94a20bac371138918df06fbcf3242"} Apr 21 16:04:38.828520 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.828491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" event={"ID":"3b732a56-615f-485f-8b03-9e7618e45b3a","Type":"ContainerStarted","Data":"ea72bde0945cd2a05f95f3ef719b1be4bddec57f496b6d076e4a21cc50513287"} Apr 21 16:04:38.828661 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.828531 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" 
event={"ID":"3b732a56-615f-485f-8b03-9e7618e45b3a","Type":"ContainerStarted","Data":"4e08c26255ed387c35df3a14972bdd02fd54f8909d56d0cebb847b00ea4b8a04"} Apr 21 16:04:38.828661 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.828548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" event={"ID":"3b732a56-615f-485f-8b03-9e7618e45b3a","Type":"ContainerStarted","Data":"956ff9ce0e9b7a191fd5eeae3a23a47688f5212a7e905b656ac4f3a878640f22"} Apr 21 16:04:38.829647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.829615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rsggj" event={"ID":"e01c7fac-d92f-4341-ad18-e2c502d62462","Type":"ContainerStarted","Data":"ee291651116197f02d4c63f8e62f9e13f613f21e6b23ef32ad5717b8d425a7e8"} Apr 21 16:04:38.831850 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.831826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" event={"ID":"696488ce-f740-4074-a939-4496126afadd","Type":"ContainerStarted","Data":"6cb267370374eb90e45a016be5e19227b05a15af47c4eae212446cacf8bd7200"} Apr 21 16:04:38.831850 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.831870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" event={"ID":"696488ce-f740-4074-a939-4496126afadd","Type":"ContainerStarted","Data":"b1203ab4535c7b6957b5921e9e107f6f8ebbd660d94911f1da291431296e0db9"} Apr 21 16:04:38.832004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.831883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" event={"ID":"696488ce-f740-4074-a939-4496126afadd","Type":"ContainerStarted","Data":"5e5637585a235030af0229235c1c8acf930856f86da71a556299f23fbe719b9a"} Apr 21 16:04:38.944642 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.944583 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jqg7t" podStartSLOduration=2.516120911 podStartE2EDuration="3.944560852s" podCreationTimestamp="2026-04-21 16:04:35 +0000 UTC" firstStartedPulling="2026-04-21 16:04:36.99146004 +0000 UTC m=+165.367641829" lastFinishedPulling="2026-04-21 16:04:38.419899973 +0000 UTC m=+166.796081770" observedRunningTime="2026-04-21 16:04:38.877265697 +0000 UTC m=+167.253447508" watchObservedRunningTime="2026-04-21 16:04:38.944560852 +0000 UTC m=+167.320742665" Apr 21 16:04:38.945940 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.945912 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645"] Apr 21 16:04:38.949263 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.949238 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:38.952808 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.952781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 16:04:38.953560 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.953432 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 16:04:38.953875 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.953833 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 16:04:38.954101 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.954084 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 16:04:38.954282 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.954267 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 16:04:38.954464 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.954450 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-f2l7vi6ddmebn\"" Apr 21 16:04:38.954677 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.954661 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-5gzs9\"" Apr 21 16:04:38.963300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:38.963269 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645"] Apr 21 16:04:39.023447 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.023413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.023623 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.023473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.023623 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.023549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-grpc-tls\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.023623 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.023579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.023623 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.023617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-tls\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.023806 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.023718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvrk5\" (UniqueName: \"kubernetes.io/projected/99da288f-71c1-4f4b-8101-b0a009cb6add-kube-api-access-zvrk5\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.023806 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.023750 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.023895 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.023830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99da288f-71c1-4f4b-8101-b0a009cb6add-metrics-client-ca\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.124970 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.124925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-grpc-tls\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.125158 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.124988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.125158 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.125030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-tls\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.125158 ip-10-0-138-191 kubenswrapper[2576]: I0421 
16:04:39.125071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvrk5\" (UniqueName: \"kubernetes.io/projected/99da288f-71c1-4f4b-8101-b0a009cb6add-kube-api-access-zvrk5\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.125158 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.125102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.125158 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.125141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99da288f-71c1-4f4b-8101-b0a009cb6add-metrics-client-ca\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.125612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.125168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.125612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.125209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.126911 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.126549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99da288f-71c1-4f4b-8101-b0a009cb6add-metrics-client-ca\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.128332 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.128303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-grpc-tls\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.128672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.128648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-tls\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.129525 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:04:39.129493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.130097 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.129939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.130097 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.130063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.131396 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.131346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/99da288f-71c1-4f4b-8101-b0a009cb6add-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.135687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.135643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvrk5\" (UniqueName: \"kubernetes.io/projected/99da288f-71c1-4f4b-8101-b0a009cb6add-kube-api-access-zvrk5\") pod \"thanos-querier-7c9b5b6fd8-fz645\" (UID: \"99da288f-71c1-4f4b-8101-b0a009cb6add\") " pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.261558 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.261458 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:39.436981 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.436948 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645"] Apr 21 16:04:39.550506 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:39.550427 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99da288f_71c1_4f4b_8101_b0a009cb6add.slice/crio-d42cf9df2f0a845a426683d8317e959084286c10a15dcf024abc84eea65c55ab WatchSource:0}: Error finding container d42cf9df2f0a845a426683d8317e959084286c10a15dcf024abc84eea65c55ab: Status 404 returned error can't find the container with id d42cf9df2f0a845a426683d8317e959084286c10a15dcf024abc84eea65c55ab Apr 21 16:04:39.836229 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.836145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" event={"ID":"99da288f-71c1-4f4b-8101-b0a009cb6add","Type":"ContainerStarted","Data":"d42cf9df2f0a845a426683d8317e959084286c10a15dcf024abc84eea65c55ab"} Apr 21 16:04:39.838330 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.838299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzhmc" event={"ID":"4f95fc4b-4e03-4d17-9691-dd89e4ec7182","Type":"ContainerStarted","Data":"89218ffe244efc7a9e4869be440a0c8624118f6cbbe0826af5fd7b456f7c3d6c"} Apr 21 16:04:39.838435 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.838332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzhmc" event={"ID":"4f95fc4b-4e03-4d17-9691-dd89e4ec7182","Type":"ContainerStarted","Data":"039dce6e5d223d8bc9fa14c5a6ef9b5bf60615d71f55c82e0ed50b4c48637252"} Apr 21 16:04:39.840664 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.840634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" event={"ID":"696488ce-f740-4074-a939-4496126afadd","Type":"ContainerStarted","Data":"565e11f75a735dba012fc46bea4a312c70d9191b9668fefe630211d2edbe598f"} Apr 21 16:04:39.859389 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.859334 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qzhmc" podStartSLOduration=2.565348225 podStartE2EDuration="3.859322219s" podCreationTimestamp="2026-04-21 16:04:36 +0000 UTC" firstStartedPulling="2026-04-21 16:04:37.121027861 +0000 UTC m=+165.497209650" lastFinishedPulling="2026-04-21 16:04:38.415001841 +0000 UTC m=+166.791183644" observedRunningTime="2026-04-21 16:04:39.85809472 +0000 UTC m=+168.234276532" watchObservedRunningTime="2026-04-21 16:04:39.859322219 +0000 UTC m=+168.235504066" Apr 21 16:04:39.877524 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:39.877471 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vc9j4" podStartSLOduration=3.862644222 podStartE2EDuration="4.877456942s" podCreationTimestamp="2026-04-21 16:04:35 +0000 UTC" firstStartedPulling="2026-04-21 16:04:38.56450182 +0000 UTC m=+166.940683616" lastFinishedPulling="2026-04-21 16:04:39.579314547 +0000 UTC m=+167.955496336" observedRunningTime="2026-04-21 16:04:39.876058887 +0000 UTC m=+168.252240721" watchObservedRunningTime="2026-04-21 16:04:39.877456942 +0000 UTC m=+168.253638753" Apr 21 16:04:40.473077 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:40.472992 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k"] Apr 21 16:04:40.475604 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:40.475582 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" Apr 21 16:04:40.478498 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:40.478312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 16:04:40.478498 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:40.478316 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-cjxwk\"" Apr 21 16:04:40.487308 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:40.487287 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k"] Apr 21 16:04:40.541345 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:40.541322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/83af9817-94a0-4dbd-a6b9-f7220da08c3b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-dqm5k\" (UID: \"83af9817-94a0-4dbd-a6b9-f7220da08c3b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" Apr 21 16:04:40.642664 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:40.642632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/83af9817-94a0-4dbd-a6b9-f7220da08c3b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-dqm5k\" (UID: \"83af9817-94a0-4dbd-a6b9-f7220da08c3b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" Apr 21 16:04:40.642822 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:40.642801 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 16:04:40.642929 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:40.642892 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83af9817-94a0-4dbd-a6b9-f7220da08c3b-monitoring-plugin-cert podName:83af9817-94a0-4dbd-a6b9-f7220da08c3b nodeName:}" failed. No retries permitted until 2026-04-21 16:04:41.14287093 +0000 UTC m=+169.519052736 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/83af9817-94a0-4dbd-a6b9-f7220da08c3b-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-dqm5k" (UID: "83af9817-94a0-4dbd-a6b9-f7220da08c3b") : secret "monitoring-plugin-cert" not found Apr 21 16:04:40.845662 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:40.845546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rsggj" event={"ID":"e01c7fac-d92f-4341-ad18-e2c502d62462","Type":"ContainerStarted","Data":"cd620301dd77fe881ab79aa060ca1aa1d56ba029f5cd8d5b6b0ef1d869af4b51"} Apr 21 16:04:41.146599 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.146560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/83af9817-94a0-4dbd-a6b9-f7220da08c3b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-dqm5k\" (UID: \"83af9817-94a0-4dbd-a6b9-f7220da08c3b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" Apr 21 16:04:41.149251 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.149222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/83af9817-94a0-4dbd-a6b9-f7220da08c3b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-dqm5k\" (UID: \"83af9817-94a0-4dbd-a6b9-f7220da08c3b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" Apr 21 16:04:41.247343 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.247311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:04:41.387828 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.387803 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" Apr 21 16:04:41.477278 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.477179 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rsggj" podStartSLOduration=135.909446966 podStartE2EDuration="2m17.477163229s" podCreationTimestamp="2026-04-21 16:02:24 +0000 UTC" firstStartedPulling="2026-04-21 16:04:38.588260877 +0000 UTC m=+166.964442670" lastFinishedPulling="2026-04-21 16:04:40.155977133 +0000 UTC m=+168.532158933" observedRunningTime="2026-04-21 16:04:40.869548003 +0000 UTC m=+169.245729873" watchObservedRunningTime="2026-04-21 16:04:41.477163229 +0000 UTC m=+169.853345041" Apr 21 16:04:41.478475 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.478450 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c7f6dc48-vpmnt"] Apr 21 16:04:41.835167 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.835139 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k"] Apr 21 16:04:41.843033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:04:41.843006 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83af9817_94a0_4dbd_a6b9_f7220da08c3b.slice/crio-59d760df67fccc7a4df19b3b5e9cb43aaed45ba8246b0e93c1f0a06d6e35982b WatchSource:0}: Error finding container 59d760df67fccc7a4df19b3b5e9cb43aaed45ba8246b0e93c1f0a06d6e35982b: Status 404 returned error can't find the container with id 59d760df67fccc7a4df19b3b5e9cb43aaed45ba8246b0e93c1f0a06d6e35982b Apr 21 16:04:41.855666 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.855475 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" event={"ID":"99da288f-71c1-4f4b-8101-b0a009cb6add","Type":"ContainerStarted","Data":"d5fc7647ab32f6fd1a7092d7c032e2bc16019e29fba0505035ae2bd6fc52308a"} Apr 21 16:04:41.857174 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:41.857149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" event={"ID":"83af9817-94a0-4dbd-a6b9-f7220da08c3b","Type":"ContainerStarted","Data":"59d760df67fccc7a4df19b3b5e9cb43aaed45ba8246b0e93c1f0a06d6e35982b"} Apr 21 16:04:42.861210 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:42.861184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" event={"ID":"99da288f-71c1-4f4b-8101-b0a009cb6add","Type":"ContainerStarted","Data":"6db012a9aeb1eef9a5d64cefd5345325891ffb7aa4fa6cbd6e9f1ffba08d8c06"} Apr 21 16:04:42.861453 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:42.861220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" event={"ID":"99da288f-71c1-4f4b-8101-b0a009cb6add","Type":"ContainerStarted","Data":"8a3908738407d59c98430ca4a6264227bf7a97972879d1078b4b86ffa5be8c50"} Apr 21 16:04:43.866763 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.866727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" event={"ID":"99da288f-71c1-4f4b-8101-b0a009cb6add","Type":"ContainerStarted","Data":"cf16cc66ec60d3a64fc5f9d69f769e4d7828579b7edaef61004343c87f50ad1b"} Apr 21 16:04:43.866763 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.866764 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" event={"ID":"99da288f-71c1-4f4b-8101-b0a009cb6add","Type":"ContainerStarted","Data":"a95bb671e2d3589aa73b754db7de52ddb5b60461497dc3bb9e0e24159448deb6"} Apr 21 16:04:43.867267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.866773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" event={"ID":"99da288f-71c1-4f4b-8101-b0a009cb6add","Type":"ContainerStarted","Data":"21a57af6a129823638724cc15e221595ffa5d3febf7658db5036dcba716c5686"} Apr 21 16:04:43.867267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.867052 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:43.868152 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.868122 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" event={"ID":"83af9817-94a0-4dbd-a6b9-f7220da08c3b","Type":"ContainerStarted","Data":"b8ab29ce2eaa333db77354abd08818863ebaff6a34316ae72471af714998d956"} Apr 21 16:04:43.868306 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.868293 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" Apr 21 16:04:43.876983 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.873974 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" Apr 21 16:04:43.899412 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.899276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" podStartSLOduration=2.633277402 podStartE2EDuration="5.899262789s" podCreationTimestamp="2026-04-21 16:04:38 +0000 UTC" firstStartedPulling="2026-04-21 16:04:39.553223812 +0000 UTC m=+167.929405604" lastFinishedPulling="2026-04-21 16:04:42.819209202 +0000 UTC m=+171.195390991" observedRunningTime="2026-04-21 16:04:43.898881964 +0000 UTC m=+172.275063776" watchObservedRunningTime="2026-04-21 16:04:43.899262789 +0000 UTC m=+172.275444600" Apr 21 16:04:43.928261 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:43.928221 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dqm5k" podStartSLOduration=2.439632513 podStartE2EDuration="3.928212176s" podCreationTimestamp="2026-04-21 16:04:40 +0000 UTC" firstStartedPulling="2026-04-21 16:04:41.844901398 +0000 UTC m=+170.221083191" lastFinishedPulling="2026-04-21 16:04:43.333481061 +0000 UTC m=+171.709662854" observedRunningTime="2026-04-21 16:04:43.926852177 +0000 UTC m=+172.303033989" watchObservedRunningTime="2026-04-21 16:04:43.928212176 +0000 UTC m=+172.304393984" Apr 21 16:04:45.812154 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:45.812121 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ffpt6" Apr 21 16:04:47.772164 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:47.772140 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:49.876625 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:49.876594 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7c9b5b6fd8-fz645" Apr 21 16:04:52.784583 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:52.784511 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-84699598f-cjccv" podUID="94cbed55-afd1-4b53-b7c5-b3cc3490ade8" containerName="registry" containerID="cri-o://456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1" gracePeriod=30 Apr 21 16:04:53.026114 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.026092 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:53.142643 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.142617 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-ca-trust-extracted\") pod \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " Apr 21 16:04:53.142750 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.142661 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-installation-pull-secrets\") pod \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " Apr 21 16:04:53.142750 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.142705 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-image-registry-private-configuration\") pod \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " Apr 21 16:04:53.142892 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.142751 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-bound-sa-token\") pod \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " Apr 21 16:04:53.142892 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.142806 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-trusted-ca\") pod \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " Apr 21 16:04:53.142892 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.142830 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-certificates\") pod \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " Apr 21 16:04:53.142892 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.142877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-tls\") pod \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " Apr 21 16:04:53.143031 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.142916 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkrwl\" (UniqueName: 
\"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-kube-api-access-wkrwl\") pod \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\" (UID: \"94cbed55-afd1-4b53-b7c5-b3cc3490ade8\") " Apr 21 16:04:53.143507 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.143434 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "94cbed55-afd1-4b53-b7c5-b3cc3490ade8" (UID: "94cbed55-afd1-4b53-b7c5-b3cc3490ade8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:04:53.143956 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.143915 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "94cbed55-afd1-4b53-b7c5-b3cc3490ade8" (UID: "94cbed55-afd1-4b53-b7c5-b3cc3490ade8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:04:53.145246 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.145222 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "94cbed55-afd1-4b53-b7c5-b3cc3490ade8" (UID: "94cbed55-afd1-4b53-b7c5-b3cc3490ade8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:04:53.145354 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.145335 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "94cbed55-afd1-4b53-b7c5-b3cc3490ade8" (UID: "94cbed55-afd1-4b53-b7c5-b3cc3490ade8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:04:53.145424 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.145402 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "94cbed55-afd1-4b53-b7c5-b3cc3490ade8" (UID: "94cbed55-afd1-4b53-b7c5-b3cc3490ade8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:04:53.145476 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.145423 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-kube-api-access-wkrwl" (OuterVolumeSpecName: "kube-api-access-wkrwl") pod "94cbed55-afd1-4b53-b7c5-b3cc3490ade8" (UID: "94cbed55-afd1-4b53-b7c5-b3cc3490ade8"). InnerVolumeSpecName "kube-api-access-wkrwl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:04:53.145661 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.145642 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "94cbed55-afd1-4b53-b7c5-b3cc3490ade8" (UID: "94cbed55-afd1-4b53-b7c5-b3cc3490ade8"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:04:53.151493 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.151472 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "94cbed55-afd1-4b53-b7c5-b3cc3490ade8" (UID: "94cbed55-afd1-4b53-b7c5-b3cc3490ade8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:53.243845 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.243809 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-installation-pull-secrets\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:04:53.243845 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.243842 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-image-registry-private-configuration\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:04:53.243987 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.243853 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-bound-sa-token\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:04:53.243987 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.243883 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-trusted-ca\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:04:53.243987 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.243891 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-certificates\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:04:53.243987 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.243900 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-registry-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:04:53.243987 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.243909 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wkrwl\" (UniqueName: \"kubernetes.io/projected/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-kube-api-access-wkrwl\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:04:53.243987 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.243917 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94cbed55-afd1-4b53-b7c5-b3cc3490ade8-ca-trust-extracted\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:04:53.897148 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.897116 2576 generic.go:358] "Generic (PLEG): container finished" podID="94cbed55-afd1-4b53-b7c5-b3cc3490ade8" containerID="456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1" exitCode=0 Apr 21 16:04:53.897572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.897165 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-84699598f-cjccv" event={"ID":"94cbed55-afd1-4b53-b7c5-b3cc3490ade8","Type":"ContainerDied","Data":"456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1"} Apr 21 16:04:53.897572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.897182 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84699598f-cjccv" Apr 21 16:04:53.897572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.897196 2576 scope.go:117] "RemoveContainer" containerID="456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1" Apr 21 16:04:53.897572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.897186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84699598f-cjccv" event={"ID":"94cbed55-afd1-4b53-b7c5-b3cc3490ade8","Type":"ContainerDied","Data":"bf9e4201d7b6fb9f6ea21ea775b54b7faaabc19d4ae64582cc49fe8ddad5bd99"} Apr 21 16:04:53.906340 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.906324 2576 scope.go:117] "RemoveContainer" containerID="456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1" Apr 21 16:04:53.906584 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:04:53.906563 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1\": container with ID starting with 456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1 not found: ID does not exist" containerID="456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1" Apr 21 16:04:53.906643 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.906591 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1"} err="failed to get container status \"456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1\": rpc error: code = NotFound desc = could not find container \"456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1\": container with ID starting with 456ad4b20673d24f37ed7cc851ce52dc0828e0433ec21e577ad433cf78375ec1 not found: ID does not exist" Apr 21 16:04:53.921256 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.921233 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84699598f-cjccv"] Apr 21 16:04:53.929673 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:53.929652 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-84699598f-cjccv"] Apr 21 16:04:54.252236 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:04:54.252158 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94cbed55-afd1-4b53-b7c5-b3cc3490ade8" path="/var/lib/kubelet/pods/94cbed55-afd1-4b53-b7c5-b3cc3490ade8/volumes" Apr 21 16:05:06.500353 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.500284 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c7f6dc48-vpmnt" podUID="eb95460d-65fc-422a-bfe4-e7a0d9b427bf" containerName="console" containerID="cri-o://1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689" gracePeriod=15 Apr 21 16:05:06.739078 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.739055 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7c7f6dc48-vpmnt_eb95460d-65fc-422a-bfe4-e7a0d9b427bf/console/0.log" Apr 21 16:05:06.739181 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.739114 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:05:06.838090 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838032 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-oauth-config\") pod \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " Apr 21 16:05:06.838090 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838062 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-oauth-serving-cert\") pod \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " Apr 21 16:05:06.838224 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838130 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjqxg\" (UniqueName: \"kubernetes.io/projected/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-kube-api-access-vjqxg\") pod \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " Apr 21 16:05:06.838224 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838173 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-service-ca\") pod \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " Apr 21 16:05:06.838302 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-serving-cert\") pod \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " Apr 21 16:05:06.838302 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838253 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-config\") pod \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\" (UID: \"eb95460d-65fc-422a-bfe4-e7a0d9b427bf\") " Apr 21 16:05:06.838627 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838590 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-service-ca" (OuterVolumeSpecName: "service-ca") pod "eb95460d-65fc-422a-bfe4-e7a0d9b427bf" (UID: "eb95460d-65fc-422a-bfe4-e7a0d9b427bf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:06.838733 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838634 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eb95460d-65fc-422a-bfe4-e7a0d9b427bf" (UID: "eb95460d-65fc-422a-bfe4-e7a0d9b427bf"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:06.838733 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.838696 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-config" (OuterVolumeSpecName: "console-config") pod "eb95460d-65fc-422a-bfe4-e7a0d9b427bf" (UID: "eb95460d-65fc-422a-bfe4-e7a0d9b427bf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:06.840270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.840239 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eb95460d-65fc-422a-bfe4-e7a0d9b427bf" (UID: "eb95460d-65fc-422a-bfe4-e7a0d9b427bf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:06.840375 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.840357 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eb95460d-65fc-422a-bfe4-e7a0d9b427bf" (UID: "eb95460d-65fc-422a-bfe4-e7a0d9b427bf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:06.840455 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.840433 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-kube-api-access-vjqxg" (OuterVolumeSpecName: "kube-api-access-vjqxg") pod "eb95460d-65fc-422a-bfe4-e7a0d9b427bf" (UID: "eb95460d-65fc-422a-bfe4-e7a0d9b427bf"). InnerVolumeSpecName "kube-api-access-vjqxg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:06.934116 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.934097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c7f6dc48-vpmnt_eb95460d-65fc-422a-bfe4-e7a0d9b427bf/console/0.log" Apr 21 16:05:06.934199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.934131 2576 generic.go:358] "Generic (PLEG): container finished" podID="eb95460d-65fc-422a-bfe4-e7a0d9b427bf" containerID="1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689" exitCode=2 Apr 21 16:05:06.934199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.934175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7f6dc48-vpmnt" event={"ID":"eb95460d-65fc-422a-bfe4-e7a0d9b427bf","Type":"ContainerDied","Data":"1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689"} Apr 21 16:05:06.934199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.934189 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c7f6dc48-vpmnt" Apr 21 16:05:06.934199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.934196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7f6dc48-vpmnt" event={"ID":"eb95460d-65fc-422a-bfe4-e7a0d9b427bf","Type":"ContainerDied","Data":"31a9de4d8927344facb2e40abd985d7d2fc4b8239f8d45b5296ae2165dd98355"} Apr 21 16:05:06.934378 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.934210 2576 scope.go:117] "RemoveContainer" containerID="1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689" Apr 21 16:05:06.938824 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.938796 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:05:06.938824 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.938822 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:05:06.938983 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.938839 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-console-oauth-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:05:06.938983 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.938879 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-oauth-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:05:06.938983 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.938895 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vjqxg\" (UniqueName: \"kubernetes.io/projected/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-kube-api-access-vjqxg\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:05:06.938983 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.938910 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb95460d-65fc-422a-bfe4-e7a0d9b427bf-service-ca\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:05:06.941578 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.941562 2576 scope.go:117] "RemoveContainer" containerID="1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689" Apr 21 16:05:06.941957 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:05:06.941819 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689\": container with ID starting with 1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689 not found: ID does not exist" containerID="1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689" Apr 21 16:05:06.941957 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.941849 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689"} err="failed to get container status \"1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689\": rpc 
error: code = NotFound desc = could not find container \"1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689\": container with ID starting with 1a2ec68545f00fb6d85f02d8a6fde3a20693786105cb79b6a24174d006ae8689 not found: ID does not exist" Apr 21 16:05:06.955475 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.955453 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c7f6dc48-vpmnt"] Apr 21 16:05:06.959582 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:06.959562 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c7f6dc48-vpmnt"] Apr 21 16:05:08.251776 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:08.251740 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb95460d-65fc-422a-bfe4-e7a0d9b427bf" path="/var/lib/kubelet/pods/eb95460d-65fc-422a-bfe4-e7a0d9b427bf/volumes" Apr 21 16:05:10.946029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:10.945989 2576 generic.go:358] "Generic (PLEG): container finished" podID="3fb650da-0a2a-4d1a-b6d9-01cfaef03e45" containerID="0631cdef1fcebfde570ff2c7cf5c489878c592f2de858c16881534f81d37146a" exitCode=0 Apr 21 16:05:10.946408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:10.946068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" event={"ID":"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45","Type":"ContainerDied","Data":"0631cdef1fcebfde570ff2c7cf5c489878c592f2de858c16881534f81d37146a"} Apr 21 16:05:10.946456 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:10.946418 2576 scope.go:117] "RemoveContainer" containerID="0631cdef1fcebfde570ff2c7cf5c489878c592f2de858c16881534f81d37146a" Apr 21 16:05:11.950431 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:05:11.950401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-w4j6r" event={"ID":"3fb650da-0a2a-4d1a-b6d9-01cfaef03e45","Type":"ContainerStarted","Data":"32283ee638966420c0214dbab6df88360edf59361200a6453f50e83b908f84fd"} Apr 21 16:06:00.259442 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.259408 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p"] Apr 21 16:06:00.259885 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.259706 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb95460d-65fc-422a-bfe4-e7a0d9b427bf" containerName="console" Apr 21 16:06:00.259885 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.259718 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb95460d-65fc-422a-bfe4-e7a0d9b427bf" containerName="console" Apr 21 16:06:00.259885 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.259731 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94cbed55-afd1-4b53-b7c5-b3cc3490ade8" containerName="registry" Apr 21 16:06:00.259885 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.259736 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cbed55-afd1-4b53-b7c5-b3cc3490ade8" containerName="registry" Apr 21 16:06:00.259885 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.259784 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="94cbed55-afd1-4b53-b7c5-b3cc3490ade8" containerName="registry" Apr 21 16:06:00.259885 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.259794 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb95460d-65fc-422a-bfe4-e7a0d9b427bf" 
containerName="console" Apr 21 16:06:00.262813 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.262789 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.266023 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.265992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 16:06:00.266534 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.266503 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 16:06:00.267050 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.267025 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 16:06:00.267168 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.267035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 16:06:00.267352 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.267332 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-xk9wb\"" Apr 21 16:06:00.267352 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.267347 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 16:06:00.271146 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.271129 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 16:06:00.277927 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.277902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p"] Apr 21 16:06:00.338144 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.338116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-secret-telemeter-client\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.338292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.338151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-serving-certs-ca-bundle\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.338292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.338197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-telemeter-client-tls\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.338292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.338262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-metrics-client-ca\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.338397 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.338289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.338397 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.338310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r9dz\" (UniqueName: \"kubernetes.io/projected/f0d1e899-28ff-427b-87d3-5ad83798f78d-kube-api-access-8r9dz\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.338397 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.338327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-federate-client-tls\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.338397 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.338374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439131 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.439100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-metrics-client-ca\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439275 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.439140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439275 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.439170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r9dz\" (UniqueName: \"kubernetes.io/projected/f0d1e899-28ff-427b-87d3-5ad83798f78d-kube-api-access-8r9dz\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439275 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:06:00.439194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-federate-client-tls\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439275 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.439226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439275 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.439262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-secret-telemeter-client\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439521 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.439292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-serving-certs-ca-bundle\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439521 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.439340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-telemeter-client-tls\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.439934 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.439909 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-metrics-client-ca\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.440158 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.440133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-serving-certs-ca-bundle\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.440246 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.440145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d1e899-28ff-427b-87d3-5ad83798f78d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.441959 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.441928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-federate-client-tls\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.441959 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.441937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-secret-telemeter-client\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.442153 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.442126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.442217 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.442180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f0d1e899-28ff-427b-87d3-5ad83798f78d-telemeter-client-tls\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.447812 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.447726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r9dz\" (UniqueName: \"kubernetes.io/projected/f0d1e899-28ff-427b-87d3-5ad83798f78d-kube-api-access-8r9dz\") pod \"telemeter-client-6f44d4dcfb-t2r9p\" (UID: \"f0d1e899-28ff-427b-87d3-5ad83798f78d\") " pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.574417 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.574347 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" Apr 21 16:06:00.731195 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:00.731166 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p"] Apr 21 16:06:00.735595 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:06:00.735555 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d1e899_28ff_427b_87d3_5ad83798f78d.slice/crio-692f02c81b4ce9951ddca00c0cfb9f351a1e73d42b9e6d87e85be4e022194525 WatchSource:0}: Error finding container 692f02c81b4ce9951ddca00c0cfb9f351a1e73d42b9e6d87e85be4e022194525: Status 404 returned error can't find the container with id 692f02c81b4ce9951ddca00c0cfb9f351a1e73d42b9e6d87e85be4e022194525 Apr 21 16:06:01.093893 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:01.093850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" event={"ID":"f0d1e899-28ff-427b-87d3-5ad83798f78d","Type":"ContainerStarted","Data":"692f02c81b4ce9951ddca00c0cfb9f351a1e73d42b9e6d87e85be4e022194525"} Apr 21 16:06:02.960792 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:02.960765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:06:02.962947 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:02.962923 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b782d8e0-5d5e-4897-91b4-49f7049c7fac-metrics-certs\") pod \"network-metrics-daemon-clkll\" (UID: \"b782d8e0-5d5e-4897-91b4-49f7049c7fac\") " pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:06:03.100197 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:03.100163 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" event={"ID":"f0d1e899-28ff-427b-87d3-5ad83798f78d","Type":"ContainerStarted","Data":"21a93d9cc173411015c4d00c6506c0edf527bedeb298182662709fd4e84c165c"} Apr 21 16:06:03.151069 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:03.151047 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cz78x\"" Apr 21 16:06:03.159130 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:03.159112 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clkll" Apr 21 16:06:03.276980 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:03.276902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-clkll"] Apr 21 16:06:03.279368 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:06:03.279333 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb782d8e0_5d5e_4897_91b4_49f7049c7fac.slice/crio-1f15cf13c1069b6a6a95cb78336ef34ad7b0e17c4fd46d6293a61ababeff9afa WatchSource:0}: Error finding container 1f15cf13c1069b6a6a95cb78336ef34ad7b0e17c4fd46d6293a61ababeff9afa: Status 404 returned error can't find the container with id 1f15cf13c1069b6a6a95cb78336ef34ad7b0e17c4fd46d6293a61ababeff9afa Apr 21 16:06:04.103888 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:04.103828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-clkll" event={"ID":"b782d8e0-5d5e-4897-91b4-49f7049c7fac","Type":"ContainerStarted","Data":"1f15cf13c1069b6a6a95cb78336ef34ad7b0e17c4fd46d6293a61ababeff9afa"} Apr 21 16:06:05.108107 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:05.108072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" event={"ID":"f0d1e899-28ff-427b-87d3-5ad83798f78d","Type":"ContainerStarted","Data":"2867ac148dfaeb814da6a72333b92150a516934124aca3a7a3b2c1e6a7b1fd82"} Apr 21 16:06:05.108558 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:05.108114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" event={"ID":"f0d1e899-28ff-427b-87d3-5ad83798f78d","Type":"ContainerStarted","Data":"2f1bea52d47e8f0e39d74d14ca9b9b353dfedfaf5cccfffe42061c8211bb4505"} Apr 21 16:06:05.109614 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:05.109589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-clkll" event={"ID":"b782d8e0-5d5e-4897-91b4-49f7049c7fac","Type":"ContainerStarted","Data":"29b6ae17a2cddd7d92e0ef1fb3bc9c5161d1486db097219f60ec642f9b139431"} Apr 21 16:06:05.109707 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:05.109617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-clkll" event={"ID":"b782d8e0-5d5e-4897-91b4-49f7049c7fac","Type":"ContainerStarted","Data":"ed314dd792dbe6b9e3effd234ff272d1d6518b981433a36d22be9c75120f529e"} Apr 21 16:06:05.145242 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:05.145194 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6f44d4dcfb-t2r9p" podStartSLOduration=1.461108649 podStartE2EDuration="5.145178452s" podCreationTimestamp="2026-04-21 16:06:00 +0000 UTC" firstStartedPulling="2026-04-21 16:06:00.73780191 +0000 UTC m=+249.113983705" lastFinishedPulling="2026-04-21 16:06:04.421871715 +0000 UTC m=+252.798053508" observedRunningTime="2026-04-21 16:06:05.144662493 +0000 UTC m=+253.520844319" watchObservedRunningTime="2026-04-21 16:06:05.145178452 +0000 UTC m=+253.521360266" Apr 21 16:06:06.003371 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.003313 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-clkll" podStartSLOduration=252.860212849 podStartE2EDuration="4m14.003291952s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" 
firstStartedPulling="2026-04-21 16:06:03.281278781 +0000 UTC m=+251.657460575" lastFinishedPulling="2026-04-21 16:06:04.424357884 +0000 UTC m=+252.800539678" observedRunningTime="2026-04-21 16:06:05.189446686 +0000 UTC m=+253.565628496" watchObservedRunningTime="2026-04-21 16:06:06.003291952 +0000 UTC m=+254.379473764" Apr 21 16:06:06.004840 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.004820 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56c54fcdb8-sl4xc"] Apr 21 16:06:06.007268 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.007251 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.019176 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.019136 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 16:06:06.019176 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.019136 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9tkts\"" Apr 21 16:06:06.019312 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.019275 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 16:06:06.019634 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.019612 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 16:06:06.019752 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.019649 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 16:06:06.020212 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.020189 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 16:06:06.020531 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.020512 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 16:06:06.020792 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.020771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 16:06:06.023699 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.023673 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56c54fcdb8-sl4xc"] Apr 21 16:06:06.024298 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.024280 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 16:06:06.086825 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.086803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-serving-cert\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.086923 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.086841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-oauth-serving-cert\") pod \"console-56c54fcdb8-sl4xc\" (UID: 
\"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.086923 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.086882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-config\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.086994 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.086945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bltgf\" (UniqueName: \"kubernetes.io/projected/b4fc923e-6920-476c-99f1-28fdc1bfb30c-kube-api-access-bltgf\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.086994 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.086972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-service-ca\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.087055 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.087012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-trusted-ca-bundle\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.087055 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.087046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-oauth-config\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.188311 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.188285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-trusted-ca-bundle\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.188634 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.188321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-oauth-config\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.188634 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.188588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-serving-cert\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.188822 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:06:06.188802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-oauth-serving-cert\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.188907 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.188840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-config\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.188970 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.188941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bltgf\" (UniqueName: \"kubernetes.io/projected/b4fc923e-6920-476c-99f1-28fdc1bfb30c-kube-api-access-bltgf\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.189027 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.188986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-service-ca\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.189247 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.189223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-trusted-ca-bundle\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.189485 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.189453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-oauth-serving-cert\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.189613 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.189594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-config\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.189689 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.189671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-service-ca\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.190834 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.190817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-oauth-config\") pod \"console-56c54fcdb8-sl4xc\" (UID: 
\"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.191105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.191084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-serving-cert\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.202844 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.202820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bltgf\" (UniqueName: \"kubernetes.io/projected/b4fc923e-6920-476c-99f1-28fdc1bfb30c-kube-api-access-bltgf\") pod \"console-56c54fcdb8-sl4xc\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.322101 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.322050 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:06.445647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:06.445624 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56c54fcdb8-sl4xc"] Apr 21 16:06:06.448197 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:06:06.448172 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4fc923e_6920_476c_99f1_28fdc1bfb30c.slice/crio-2c075575921dbe51e959720a98bdac722e78a369906c1067427d4d67dc109985 WatchSource:0}: Error finding container 2c075575921dbe51e959720a98bdac722e78a369906c1067427d4d67dc109985: Status 404 returned error can't find the container with id 2c075575921dbe51e959720a98bdac722e78a369906c1067427d4d67dc109985 Apr 21 16:06:07.117309 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:07.117273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c54fcdb8-sl4xc" event={"ID":"b4fc923e-6920-476c-99f1-28fdc1bfb30c","Type":"ContainerStarted","Data":"84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e"} Apr 21 16:06:07.117309 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:07.117311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c54fcdb8-sl4xc" event={"ID":"b4fc923e-6920-476c-99f1-28fdc1bfb30c","Type":"ContainerStarted","Data":"2c075575921dbe51e959720a98bdac722e78a369906c1067427d4d67dc109985"} Apr 21 16:06:07.137586 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:07.137521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56c54fcdb8-sl4xc" podStartSLOduration=2.137505202 podStartE2EDuration="2.137505202s" podCreationTimestamp="2026-04-21 16:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:06:07.135722879 +0000 UTC m=+255.511904707" watchObservedRunningTime="2026-04-21 16:06:07.137505202 +0000 UTC m=+255.513687015" Apr 21 16:06:16.322921 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:16.322799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:16.322921 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:16.322844 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 
21 16:06:16.327950 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:16.327922 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:17.150547 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:17.150515 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:06:52.137129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:52.137096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:06:52.137600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:52.137096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:06:52.140245 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:06:52.140217 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 16:07:17.770343 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.770312 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-579bfd59c8-j698z"] Apr 21 16:07:17.772584 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.772567 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.793540 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.793518 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579bfd59c8-j698z"] Apr 21 16:07:17.819553 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.819526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-service-ca\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.819645 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.819556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-trusted-ca-bundle\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.819645 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.819575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-oauth-serving-cert\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.819645 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.819597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-serving-cert\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.819747 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.819650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-frtnh\" (UniqueName: \"kubernetes.io/projected/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-kube-api-access-frtnh\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.819747 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.819702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-oauth-config\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.819747 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.819744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-config\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.920719 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.920697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-oauth-serving-cert\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.920844 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.920730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-serving-cert\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.920844 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.920765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frtnh\" (UniqueName: \"kubernetes.io/projected/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-kube-api-access-frtnh\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.920844 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.920786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-oauth-config\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.920844 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.920834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-config\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.921082 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.920916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-service-ca\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " 
pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.921082 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.920939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-trusted-ca-bundle\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.921478 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.921449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-oauth-serving-cert\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.921596 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.921572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-config\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.921739 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.921714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-service-ca\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.922415 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.922394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-trusted-ca-bundle\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.923235 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.923213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-oauth-config\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.923347 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.923327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-serving-cert\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:17.929395 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:17.929366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtnh\" (UniqueName: \"kubernetes.io/projected/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-kube-api-access-frtnh\") pod \"console-579bfd59c8-j698z\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:18.081548 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:18.081489 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:18.236781 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:18.236756 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579bfd59c8-j698z"] Apr 21 16:07:18.238569 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:07:18.238534 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode989038b_2f0d_4dc9_b6d3_fd56e2c32c7a.slice/crio-5afa2fea425a73130c040c72c2b49b28c5e3a6e6b40f6f2c0ae5d80ff3d8a248 WatchSource:0}: Error finding container 5afa2fea425a73130c040c72c2b49b28c5e3a6e6b40f6f2c0ae5d80ff3d8a248: Status 404 returned error can't find the container with id 5afa2fea425a73130c040c72c2b49b28c5e3a6e6b40f6f2c0ae5d80ff3d8a248 Apr 21 16:07:18.240427 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:18.240409 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:07:18.335436 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:18.335366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579bfd59c8-j698z" event={"ID":"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a","Type":"ContainerStarted","Data":"74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14"} Apr 21 16:07:18.335436 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:18.335404 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579bfd59c8-j698z" event={"ID":"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a","Type":"ContainerStarted","Data":"5afa2fea425a73130c040c72c2b49b28c5e3a6e6b40f6f2c0ae5d80ff3d8a248"} Apr 21 16:07:18.363021 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:18.362965 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-579bfd59c8-j698z" podStartSLOduration=1.36294618 podStartE2EDuration="1.36294618s" podCreationTimestamp="2026-04-21 16:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:07:18.362287809 +0000 UTC m=+326.738469621" watchObservedRunningTime="2026-04-21 16:07:18.36294618 +0000 UTC m=+326.739128000" Apr 21 16:07:28.082586 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:28.082546 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:28.083245 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:28.082649 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:28.087440 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:28.087416 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:28.373146 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:28.373120 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:07:28.432940 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:28.432913 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56c54fcdb8-sl4xc"] Apr 21 16:07:53.456958 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.456844 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56c54fcdb8-sl4xc" podUID="b4fc923e-6920-476c-99f1-28fdc1bfb30c" containerName="console" 
containerID="cri-o://84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e" gracePeriod=15 Apr 21 16:07:53.694851 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.694828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56c54fcdb8-sl4xc_b4fc923e-6920-476c-99f1-28fdc1bfb30c/console/0.log" Apr 21 16:07:53.694964 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.694907 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:07:53.794871 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.794804 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-service-ca\") pod \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " Apr 21 16:07:53.794871 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.794834 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-oauth-config\") pod \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " Apr 21 16:07:53.795057 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.794877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-oauth-serving-cert\") pod \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " Apr 21 16:07:53.795057 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.794900 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-config\") pod \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " Apr 21 16:07:53.795057 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.794948 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-trusted-ca-bundle\") pod \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " Apr 21 16:07:53.795057 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.794968 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-serving-cert\") pod \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " Apr 21 16:07:53.795057 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.794985 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bltgf\" (UniqueName: \"kubernetes.io/projected/b4fc923e-6920-476c-99f1-28fdc1bfb30c-kube-api-access-bltgf\") pod \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\" (UID: \"b4fc923e-6920-476c-99f1-28fdc1bfb30c\") " Apr 21 16:07:53.795335 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.795300 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b4fc923e-6920-476c-99f1-28fdc1bfb30c" (UID: 
"b4fc923e-6920-476c-99f1-28fdc1bfb30c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:07:53.795390 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.795338 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-config" (OuterVolumeSpecName: "console-config") pod "b4fc923e-6920-476c-99f1-28fdc1bfb30c" (UID: "b4fc923e-6920-476c-99f1-28fdc1bfb30c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:07:53.795498 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.795389 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b4fc923e-6920-476c-99f1-28fdc1bfb30c" (UID: "b4fc923e-6920-476c-99f1-28fdc1bfb30c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:07:53.795577 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.795549 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-service-ca" (OuterVolumeSpecName: "service-ca") pod "b4fc923e-6920-476c-99f1-28fdc1bfb30c" (UID: "b4fc923e-6920-476c-99f1-28fdc1bfb30c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:07:53.797040 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.797015 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b4fc923e-6920-476c-99f1-28fdc1bfb30c" (UID: "b4fc923e-6920-476c-99f1-28fdc1bfb30c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:07:53.797128 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.797034 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b4fc923e-6920-476c-99f1-28fdc1bfb30c" (UID: "b4fc923e-6920-476c-99f1-28fdc1bfb30c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:07:53.797128 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.797105 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4fc923e-6920-476c-99f1-28fdc1bfb30c-kube-api-access-bltgf" (OuterVolumeSpecName: "kube-api-access-bltgf") pod "b4fc923e-6920-476c-99f1-28fdc1bfb30c" (UID: "b4fc923e-6920-476c-99f1-28fdc1bfb30c"). InnerVolumeSpecName "kube-api-access-bltgf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:07:53.896062 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.896036 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:07:53.896062 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.896060 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bltgf\" (UniqueName: \"kubernetes.io/projected/b4fc923e-6920-476c-99f1-28fdc1bfb30c-kube-api-access-bltgf\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:07:53.896199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.896075 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-service-ca\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:07:53.896199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.896087 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-oauth-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:07:53.896199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.896099 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-oauth-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:07:53.896199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.896113 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-console-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:07:53.896199 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:53.896127 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4fc923e-6920-476c-99f1-28fdc1bfb30c-trusted-ca-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:07:54.447079 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.447050 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56c54fcdb8-sl4xc_b4fc923e-6920-476c-99f1-28fdc1bfb30c/console/0.log" Apr 21 16:07:54.447216 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.447091 2576 generic.go:358] "Generic (PLEG): container finished" podID="b4fc923e-6920-476c-99f1-28fdc1bfb30c" containerID="84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e" exitCode=2 Apr 21 16:07:54.447216 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.447161 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56c54fcdb8-sl4xc" Apr 21 16:07:54.447216 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.447180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c54fcdb8-sl4xc" event={"ID":"b4fc923e-6920-476c-99f1-28fdc1bfb30c","Type":"ContainerDied","Data":"84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e"} Apr 21 16:07:54.447335 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.447220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c54fcdb8-sl4xc" event={"ID":"b4fc923e-6920-476c-99f1-28fdc1bfb30c","Type":"ContainerDied","Data":"2c075575921dbe51e959720a98bdac722e78a369906c1067427d4d67dc109985"} Apr 21 16:07:54.447335 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.447236 2576 scope.go:117] "RemoveContainer" containerID="84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e" Apr 21 16:07:54.455042 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.455023 2576 scope.go:117] "RemoveContainer" containerID="84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e" Apr 21 16:07:54.455293 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:07:54.455273 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e\": container with ID starting with 84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e not found: ID does not exist" containerID="84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e" Apr 21 16:07:54.455337 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.455300 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e"} err="failed to get container status \"84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e\": rpc error: code = NotFound desc = could not find container \"84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e\": container with ID starting with 84823f28c3ca9160232d53474d1d19bff3aebc18db873d65e156814576c61e9e not found: ID does not exist" Apr 21 16:07:54.466167 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.466145 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56c54fcdb8-sl4xc"] Apr 21 16:07:54.470191 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:54.470168 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56c54fcdb8-sl4xc"] Apr 21 16:07:56.251673 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:07:56.251632 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4fc923e-6920-476c-99f1-28fdc1bfb30c" path="/var/lib/kubelet/pods/b4fc923e-6920-476c-99f1-28fdc1bfb30c/volumes" Apr 21 16:08:07.632887 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.632841 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fh7mj"] Apr 21 16:08:07.633316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.633186 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4fc923e-6920-476c-99f1-28fdc1bfb30c" containerName="console" Apr 21 16:08:07.633316 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.633198 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4fc923e-6920-476c-99f1-28fdc1bfb30c" containerName="console" Apr 21 16:08:07.633316 ip-10-0-138-191 
kubenswrapper[2576]: I0421 16:08:07.633256 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4fc923e-6920-476c-99f1-28fdc1bfb30c" containerName="console" Apr 21 16:08:07.637653 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.637635 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.641051 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.641029 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 16:08:07.662895 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.662871 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fh7mj"] Apr 21 16:08:07.805445 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.805413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-kubelet-config\") pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.805635 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.805540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-original-pull-secret\") pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.805635 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.805601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-dbus\") pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.905989 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.905961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-dbus\") pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.906167 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.906005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-kubelet-config\") pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.906167 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.906079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-original-pull-secret\") pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.906167 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.906135 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-dbus\") 
pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.906167 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.906147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-kubelet-config\") pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.908351 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.908329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac5d67c0-6512-49e3-99ae-4ea362c0dbe3-original-pull-secret\") pod \"global-pull-secret-syncer-fh7mj\" (UID: \"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3\") " pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:07.946321 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:07.946288 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fh7mj" Apr 21 16:08:08.066731 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:08.066702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fh7mj"] Apr 21 16:08:08.068776 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:08:08.068743 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5d67c0_6512_49e3_99ae_4ea362c0dbe3.slice/crio-11a353d503ab86ba3209b852a324a00b96c53b98e5339d684ec6931986778e62 WatchSource:0}: Error finding container 11a353d503ab86ba3209b852a324a00b96c53b98e5339d684ec6931986778e62: Status 404 returned error can't find the container with id 11a353d503ab86ba3209b852a324a00b96c53b98e5339d684ec6931986778e62 Apr 21 16:08:08.489163 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:08.489133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fh7mj" event={"ID":"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3","Type":"ContainerStarted","Data":"11a353d503ab86ba3209b852a324a00b96c53b98e5339d684ec6931986778e62"} Apr 21 16:08:12.505058 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:12.505019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fh7mj" event={"ID":"ac5d67c0-6512-49e3-99ae-4ea362c0dbe3","Type":"ContainerStarted","Data":"f97711d58da513c2bdcbcf07ad03a85ed8c08bc897127a97ae870902001b55fb"} Apr 21 16:08:12.535509 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:12.535464 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fh7mj" podStartSLOduration=2.006987274 podStartE2EDuration="5.535452113s" podCreationTimestamp="2026-04-21 16:08:07 +0000 UTC" firstStartedPulling="2026-04-21 16:08:08.070422686 +0000 UTC m=+376.446604490" lastFinishedPulling="2026-04-21 16:08:11.598887539 +0000 UTC m=+379.975069329" observedRunningTime="2026-04-21 16:08:12.534065028 +0000 UTC m=+380.910246840" watchObservedRunningTime="2026-04-21 16:08:12.535452113 +0000 UTC m=+380.911633923" Apr 21 16:08:21.495213 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.495171 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz"] Apr 21 16:08:21.498676 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.498658 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.501802 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.501779 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-24zmv\"" Apr 21 16:08:21.501957 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.501788 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 16:08:21.502745 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.502730 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 16:08:21.511349 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.511324 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz"] Apr 21 16:08:21.611581 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.611547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.611780 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.611670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.611780 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.611705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q4dz\" (UniqueName: \"kubernetes.io/projected/00ef3ada-eb08-468c-89e7-1771a4122fae-kube-api-access-9q4dz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.712913 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.712848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q4dz\" (UniqueName: \"kubernetes.io/projected/00ef3ada-eb08-468c-89e7-1771a4122fae-kube-api-access-9q4dz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.713109 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.712948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.713109 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.713026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.713329 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.713304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.713404 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.713353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.722899 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.722872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q4dz\" (UniqueName: \"kubernetes.io/projected/00ef3ada-eb08-468c-89e7-1771a4122fae-kube-api-access-9q4dz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.808298 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.808240 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:21.927514 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:21.927482 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz"] Apr 21 16:08:21.928907 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:08:21.928882 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ef3ada_eb08_468c_89e7_1771a4122fae.slice/crio-3441898a181c307887e69a41b28c27612146febbbe7c3b68a35ee679eb793c22 WatchSource:0}: Error finding container 3441898a181c307887e69a41b28c27612146febbbe7c3b68a35ee679eb793c22: Status 404 returned error can't find the container with id 3441898a181c307887e69a41b28c27612146febbbe7c3b68a35ee679eb793c22 Apr 21 16:08:22.533913 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:22.533879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" event={"ID":"00ef3ada-eb08-468c-89e7-1771a4122fae","Type":"ContainerStarted","Data":"3441898a181c307887e69a41b28c27612146febbbe7c3b68a35ee679eb793c22"} Apr 21 16:08:27.553173 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:27.553137 2576 generic.go:358] "Generic (PLEG): container finished" podID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerID="8881899aa752c345e8b72e8e9d0ddbe86cfc6888ec83e67d29e50874b1ad3af4" exitCode=0 Apr 21 16:08:27.553547 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:27.553179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" event={"ID":"00ef3ada-eb08-468c-89e7-1771a4122fae","Type":"ContainerDied","Data":"8881899aa752c345e8b72e8e9d0ddbe86cfc6888ec83e67d29e50874b1ad3af4"} Apr 21 16:08:29.560431 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:29.560398 2576 generic.go:358] "Generic (PLEG): container finished" podID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerID="15c8c358364ad2594b5b33af66f01041e3e1987c59ebe553410d2b055eb8e345" exitCode=0 Apr 21 16:08:29.560733 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:29.560453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" event={"ID":"00ef3ada-eb08-468c-89e7-1771a4122fae","Type":"ContainerDied","Data":"15c8c358364ad2594b5b33af66f01041e3e1987c59ebe553410d2b055eb8e345"} Apr 21 16:08:35.583266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:35.583233 2576 generic.go:358] "Generic (PLEG): container finished" podID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerID="53a3874cc49df7f299e6b532d85c8fe6ae6b8c90b9eac389c8b498223cde76db" exitCode=0 Apr 21 16:08:35.583558 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:35.583281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" event={"ID":"00ef3ada-eb08-468c-89e7-1771a4122fae","Type":"ContainerDied","Data":"53a3874cc49df7f299e6b532d85c8fe6ae6b8c90b9eac389c8b498223cde76db"} Apr 21 16:08:36.741630 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.741603 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:36.835237 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.835199 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-util\") pod \"00ef3ada-eb08-468c-89e7-1771a4122fae\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " Apr 21 16:08:36.835237 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.835247 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-bundle\") pod \"00ef3ada-eb08-468c-89e7-1771a4122fae\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " Apr 21 16:08:36.835454 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.835265 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q4dz\" (UniqueName: \"kubernetes.io/projected/00ef3ada-eb08-468c-89e7-1771a4122fae-kube-api-access-9q4dz\") pod \"00ef3ada-eb08-468c-89e7-1771a4122fae\" (UID: \"00ef3ada-eb08-468c-89e7-1771a4122fae\") " Apr 21 16:08:36.835915 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.835848 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-bundle" (OuterVolumeSpecName: "bundle") pod "00ef3ada-eb08-468c-89e7-1771a4122fae" (UID: "00ef3ada-eb08-468c-89e7-1771a4122fae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:08:36.837504 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.837442 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ef3ada-eb08-468c-89e7-1771a4122fae-kube-api-access-9q4dz" (OuterVolumeSpecName: "kube-api-access-9q4dz") pod "00ef3ada-eb08-468c-89e7-1771a4122fae" (UID: "00ef3ada-eb08-468c-89e7-1771a4122fae"). InnerVolumeSpecName "kube-api-access-9q4dz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:08:36.839219 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.839190 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-util" (OuterVolumeSpecName: "util") pod "00ef3ada-eb08-468c-89e7-1771a4122fae" (UID: "00ef3ada-eb08-468c-89e7-1771a4122fae"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:08:36.936346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.936322 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:08:36.936346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.936346 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ef3ada-eb08-468c-89e7-1771a4122fae-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:08:36.936468 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:36.936359 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9q4dz\" (UniqueName: \"kubernetes.io/projected/00ef3ada-eb08-468c-89e7-1771a4122fae-kube-api-access-9q4dz\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:08:37.590611 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:37.590574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" event={"ID":"00ef3ada-eb08-468c-89e7-1771a4122fae","Type":"ContainerDied","Data":"3441898a181c307887e69a41b28c27612146febbbe7c3b68a35ee679eb793c22"} Apr 21 16:08:37.590611 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:37.590606 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9ncpz" Apr 21 16:08:37.590611 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:37.590613 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3441898a181c307887e69a41b28c27612146febbbe7c3b68a35ee679eb793c22" Apr 21 16:08:45.538810 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.538779 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w"] Apr 21 16:08:45.539267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.539094 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerName="extract" Apr 21 16:08:45.539267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.539105 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerName="extract" Apr 21 16:08:45.539267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.539117 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerName="pull" Apr 21 16:08:45.539267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.539136 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerName="pull" Apr 21 16:08:45.539267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.539146 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerName="util" Apr 21 16:08:45.539267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.539152 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ef3ada-eb08-468c-89e7-1771a4122fae" containerName="util" Apr 21 16:08:45.539267 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.539217 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="00ef3ada-eb08-468c-89e7-1771a4122fae" 
containerName="extract" Apr 21 16:08:45.571880 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.571836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w"] Apr 21 16:08:45.572003 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.571950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" Apr 21 16:08:45.574914 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.574890 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 16:08:45.575098 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.575081 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-vklt6\"" Apr 21 16:08:45.575170 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.575114 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:08:45.701161 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.701132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hn4d\" (UniqueName: \"kubernetes.io/projected/c84e18ba-16a1-482f-8d6f-2eceb864e20d-kube-api-access-5hn4d\") pod \"cert-manager-operator-controller-manager-54b9655956-jkz2w\" (UID: \"c84e18ba-16a1-482f-8d6f-2eceb864e20d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" Apr 21 16:08:45.701285 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.701183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c84e18ba-16a1-482f-8d6f-2eceb864e20d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-jkz2w\" (UID: \"c84e18ba-16a1-482f-8d6f-2eceb864e20d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" Apr 21 16:08:45.802366 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.802308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hn4d\" (UniqueName: \"kubernetes.io/projected/c84e18ba-16a1-482f-8d6f-2eceb864e20d-kube-api-access-5hn4d\") pod \"cert-manager-operator-controller-manager-54b9655956-jkz2w\" (UID: \"c84e18ba-16a1-482f-8d6f-2eceb864e20d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" Apr 21 16:08:45.802366 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.802359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c84e18ba-16a1-482f-8d6f-2eceb864e20d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-jkz2w\" (UID: \"c84e18ba-16a1-482f-8d6f-2eceb864e20d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" Apr 21 16:08:45.802685 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.802670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c84e18ba-16a1-482f-8d6f-2eceb864e20d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-jkz2w\" (UID: \"c84e18ba-16a1-482f-8d6f-2eceb864e20d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" 
Apr 21 16:08:45.812463 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.812443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hn4d\" (UniqueName: \"kubernetes.io/projected/c84e18ba-16a1-482f-8d6f-2eceb864e20d-kube-api-access-5hn4d\") pod \"cert-manager-operator-controller-manager-54b9655956-jkz2w\" (UID: \"c84e18ba-16a1-482f-8d6f-2eceb864e20d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" Apr 21 16:08:45.881806 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:45.881781 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" Apr 21 16:08:46.014553 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:46.014525 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w"] Apr 21 16:08:46.019097 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:08:46.019066 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc84e18ba_16a1_482f_8d6f_2eceb864e20d.slice/crio-8f17194d22fff9bfc11def355a0b7f2bd19d571f1d5378863b998bc76e0e5d7b WatchSource:0}: Error finding container 8f17194d22fff9bfc11def355a0b7f2bd19d571f1d5378863b998bc76e0e5d7b: Status 404 returned error can't find the container with id 8f17194d22fff9bfc11def355a0b7f2bd19d571f1d5378863b998bc76e0e5d7b Apr 21 16:08:46.623795 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:46.623751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" event={"ID":"c84e18ba-16a1-482f-8d6f-2eceb864e20d","Type":"ContainerStarted","Data":"8f17194d22fff9bfc11def355a0b7f2bd19d571f1d5378863b998bc76e0e5d7b"} Apr 21 16:08:48.633905 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:48.633868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" event={"ID":"c84e18ba-16a1-482f-8d6f-2eceb864e20d","Type":"ContainerStarted","Data":"a3fc763c76209abf2fe59fc13cec267026c4bcd7bea33db71b987e14df9e16b6"} Apr 21 16:08:48.692604 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:48.692552 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jkz2w" podStartSLOduration=1.817364727 podStartE2EDuration="3.692539964s" podCreationTimestamp="2026-04-21 16:08:45 +0000 UTC" firstStartedPulling="2026-04-21 16:08:46.021746211 +0000 UTC m=+414.397928000" lastFinishedPulling="2026-04-21 16:08:47.896921445 +0000 UTC m=+416.273103237" observedRunningTime="2026-04-21 16:08:48.691431933 +0000 UTC m=+417.067613745" watchObservedRunningTime="2026-04-21 16:08:48.692539964 +0000 UTC m=+417.068721774" Apr 21 16:08:49.582679 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.582648 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm"] Apr 21 16:08:49.602710 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.602689 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.607301 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.607277 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm"] Apr 21 16:08:49.608585 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.608527 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 16:08:49.608585 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.608530 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 16:08:49.608944 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.608928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-24zmv\"" Apr 21 16:08:49.734450 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.734422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-924jl\" (UniqueName: \"kubernetes.io/projected/0aee83ee-f791-404c-be46-2bb64f55c862-kube-api-access-924jl\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.734794 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.734478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.734794 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.734628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.835271 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.835219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-924jl\" (UniqueName: \"kubernetes.io/projected/0aee83ee-f791-404c-be46-2bb64f55c862-kube-api-access-924jl\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.835271 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.835262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.835421 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.835315 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.835714 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.835696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.835780 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.835734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.851794 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.851768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-924jl\" (UniqueName: \"kubernetes.io/projected/0aee83ee-f791-404c-be46-2bb64f55c862-kube-api-access-924jl\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:49.912238 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:49.912218 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:50.053129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:50.053007 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm"] Apr 21 16:08:50.055915 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:08:50.055888 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aee83ee_f791_404c_be46_2bb64f55c862.slice/crio-3e42a0c8d95193074d70f4ddba6728f9f0c8d3b1f78797dfe876ae3efb3c7e6e WatchSource:0}: Error finding container 3e42a0c8d95193074d70f4ddba6728f9f0c8d3b1f78797dfe876ae3efb3c7e6e: Status 404 returned error can't find the container with id 3e42a0c8d95193074d70f4ddba6728f9f0c8d3b1f78797dfe876ae3efb3c7e6e Apr 21 16:08:50.642138 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:50.642098 2576 generic.go:358] "Generic (PLEG): container finished" podID="0aee83ee-f791-404c-be46-2bb64f55c862" containerID="141b85d24977269168caf4d6a14ab79d3365be59c23774ecb2414757bec08a0d" exitCode=0 Apr 21 16:08:50.642326 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:50.642184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" event={"ID":"0aee83ee-f791-404c-be46-2bb64f55c862","Type":"ContainerDied","Data":"141b85d24977269168caf4d6a14ab79d3365be59c23774ecb2414757bec08a0d"} Apr 21 16:08:50.642326 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:50.642228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" event={"ID":"0aee83ee-f791-404c-be46-2bb64f55c862","Type":"ContainerStarted","Data":"3e42a0c8d95193074d70f4ddba6728f9f0c8d3b1f78797dfe876ae3efb3c7e6e"} Apr 21 16:08:51.756314 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.756281 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-m2wb4"] Apr 21 16:08:51.758827 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.758806 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:51.761454 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.761435 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 16:08:51.761564 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.761439 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-drstv\"" Apr 21 16:08:51.762539 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.762519 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 16:08:51.776213 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.776191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-m2wb4"] Apr 21 16:08:51.847974 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.847951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-m2wb4\" (UID: \"8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:51.848072 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.847991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgvj2\" (UniqueName: \"kubernetes.io/projected/8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8-kube-api-access-hgvj2\") pod \"cert-manager-webhook-587ccfb98-m2wb4\" (UID: \"8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:51.949264 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.949242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-m2wb4\" (UID: \"8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:51.949361 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.949283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgvj2\" (UniqueName: \"kubernetes.io/projected/8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8-kube-api-access-hgvj2\") pod \"cert-manager-webhook-587ccfb98-m2wb4\" (UID: \"8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:51.969155 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.969134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-m2wb4\" (UID: \"8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:51.970159 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:51.970140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgvj2\" (UniqueName: \"kubernetes.io/projected/8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8-kube-api-access-hgvj2\") pod \"cert-manager-webhook-587ccfb98-m2wb4\" (UID: \"8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:52.079310 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:52.079263 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:52.212508 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:52.212485 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-m2wb4"] Apr 21 16:08:52.214514 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:08:52.214487 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fcf2dad_0f10_4c2d_8967_bde8f5ac6ec8.slice/crio-3ff11182acdecda05be82baf52abcc6840ff8bb4d2d77e0894a5e362d7f167d5 WatchSource:0}: Error finding container 3ff11182acdecda05be82baf52abcc6840ff8bb4d2d77e0894a5e362d7f167d5: Status 404 returned error can't find the container with id 3ff11182acdecda05be82baf52abcc6840ff8bb4d2d77e0894a5e362d7f167d5 Apr 21 16:08:52.650290 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:52.650252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" event={"ID":"8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8","Type":"ContainerStarted","Data":"3ff11182acdecda05be82baf52abcc6840ff8bb4d2d77e0894a5e362d7f167d5"} Apr 21 16:08:55.665737 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:55.665681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" event={"ID":"8fcf2dad-0f10-4c2d-8967-bde8f5ac6ec8","Type":"ContainerStarted","Data":"4a4eeffce1a35196e643698b9de8d10b44ca78243e0b41bde5cc1077b8ad22bb"} Apr 21 16:08:55.665737 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:55.665740 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:08:55.667625 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:55.667595 2576 generic.go:358] "Generic (PLEG): container finished" podID="0aee83ee-f791-404c-be46-2bb64f55c862" containerID="0c0fa0444e7233cabb6cfe6882ee4dc40f053d19459fc6e16b2122e48a170086" exitCode=0 Apr 21 16:08:55.667764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:55.667685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" event={"ID":"0aee83ee-f791-404c-be46-2bb64f55c862","Type":"ContainerDied","Data":"0c0fa0444e7233cabb6cfe6882ee4dc40f053d19459fc6e16b2122e48a170086"} Apr 21 16:08:55.688439 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:55.688395 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" podStartSLOduration=1.6266471660000001 podStartE2EDuration="4.688381443s" podCreationTimestamp="2026-04-21 16:08:51 +0000 UTC" firstStartedPulling="2026-04-21 16:08:52.216270422 +0000 UTC m=+420.592452211" lastFinishedPulling="2026-04-21 16:08:55.278004696 +0000 UTC m=+423.654186488" observedRunningTime="2026-04-21 16:08:55.686299093 +0000 UTC m=+424.062480905" watchObservedRunningTime="2026-04-21 16:08:55.688381443 +0000 UTC m=+424.064563254" Apr 21 16:08:56.328004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.327965 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-mpfpb"] Apr 21 16:08:56.330920 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.330890 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" Apr 21 16:08:56.335432 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.335410 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-4czk7\"" Apr 21 16:08:56.343018 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.342996 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-mpfpb"] Apr 21 16:08:56.486288 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.486261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vslfm\" (UniqueName: \"kubernetes.io/projected/6007fa92-cc16-4ebf-b942-40e95ee54834-kube-api-access-vslfm\") pod \"cert-manager-cainjector-68b757865b-mpfpb\" (UID: \"6007fa92-cc16-4ebf-b942-40e95ee54834\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" Apr 21 16:08:56.486399 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.486298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6007fa92-cc16-4ebf-b942-40e95ee54834-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-mpfpb\" (UID: \"6007fa92-cc16-4ebf-b942-40e95ee54834\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" Apr 21 16:08:56.586659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.586603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vslfm\" (UniqueName: \"kubernetes.io/projected/6007fa92-cc16-4ebf-b942-40e95ee54834-kube-api-access-vslfm\") pod \"cert-manager-cainjector-68b757865b-mpfpb\" (UID: \"6007fa92-cc16-4ebf-b942-40e95ee54834\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" Apr 21 16:08:56.586659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.586638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6007fa92-cc16-4ebf-b942-40e95ee54834-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-mpfpb\" (UID: \"6007fa92-cc16-4ebf-b942-40e95ee54834\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" Apr 21 16:08:56.598774 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.598753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vslfm\" (UniqueName: \"kubernetes.io/projected/6007fa92-cc16-4ebf-b942-40e95ee54834-kube-api-access-vslfm\") pod \"cert-manager-cainjector-68b757865b-mpfpb\" (UID: \"6007fa92-cc16-4ebf-b942-40e95ee54834\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" Apr 21 16:08:56.598853 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.598815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6007fa92-cc16-4ebf-b942-40e95ee54834-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-mpfpb\" (UID: \"6007fa92-cc16-4ebf-b942-40e95ee54834\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" Apr 21 16:08:56.640436 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.640414 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" Apr 21 16:08:56.672561 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.672535 2576 generic.go:358] "Generic (PLEG): container finished" podID="0aee83ee-f791-404c-be46-2bb64f55c862" containerID="c7bf2b972f811c3c88f99913e63857ed7c3af2252369e5cf42bbfa41d7f5f78c" exitCode=0 Apr 21 16:08:56.672871 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.672614 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" event={"ID":"0aee83ee-f791-404c-be46-2bb64f55c862","Type":"ContainerDied","Data":"c7bf2b972f811c3c88f99913e63857ed7c3af2252369e5cf42bbfa41d7f5f78c"} Apr 21 16:08:56.766819 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:56.766796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-mpfpb"] Apr 21 16:08:56.768749 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:08:56.768723 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6007fa92_cc16_4ebf_b942_40e95ee54834.slice/crio-28a84221cc17c9ac44975e01cfa60ef92999bb0acf51a3d2d3be43884f1ae11d WatchSource:0}: Error finding container 28a84221cc17c9ac44975e01cfa60ef92999bb0acf51a3d2d3be43884f1ae11d: Status 404 returned error can't find the container with id 28a84221cc17c9ac44975e01cfa60ef92999bb0acf51a3d2d3be43884f1ae11d Apr 21 16:08:57.676703 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.676672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" event={"ID":"6007fa92-cc16-4ebf-b942-40e95ee54834","Type":"ContainerStarted","Data":"fbc978dcd09865e0e3f3d27d7a45242c5d7e5a10d01837549c2014bc6fe48ae9"} Apr 21 16:08:57.677173 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.676710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" event={"ID":"6007fa92-cc16-4ebf-b942-40e95ee54834","Type":"ContainerStarted","Data":"28a84221cc17c9ac44975e01cfa60ef92999bb0acf51a3d2d3be43884f1ae11d"} Apr 21 16:08:57.694079 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.694037 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-mpfpb" podStartSLOduration=1.694022805 podStartE2EDuration="1.694022805s" podCreationTimestamp="2026-04-21 16:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:08:57.693018484 +0000 UTC m=+426.069200297" watchObservedRunningTime="2026-04-21 16:08:57.694022805 +0000 UTC m=+426.070204616" Apr 21 16:08:57.804671 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.804650 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:08:57.896056 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.896033 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-924jl\" (UniqueName: \"kubernetes.io/projected/0aee83ee-f791-404c-be46-2bb64f55c862-kube-api-access-924jl\") pod \"0aee83ee-f791-404c-be46-2bb64f55c862\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " Apr 21 16:08:57.896176 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.896115 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-util\") pod \"0aee83ee-f791-404c-be46-2bb64f55c862\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " Apr 21 16:08:57.896260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.896225 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-bundle\") pod \"0aee83ee-f791-404c-be46-2bb64f55c862\" (UID: \"0aee83ee-f791-404c-be46-2bb64f55c862\") " Apr 21 16:08:57.896564 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.896540 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-bundle" (OuterVolumeSpecName: "bundle") pod "0aee83ee-f791-404c-be46-2bb64f55c862" (UID: "0aee83ee-f791-404c-be46-2bb64f55c862"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:08:57.898082 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.898065 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aee83ee-f791-404c-be46-2bb64f55c862-kube-api-access-924jl" (OuterVolumeSpecName: "kube-api-access-924jl") pod "0aee83ee-f791-404c-be46-2bb64f55c862" (UID: "0aee83ee-f791-404c-be46-2bb64f55c862"). InnerVolumeSpecName "kube-api-access-924jl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:08:57.902540 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.902519 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-util" (OuterVolumeSpecName: "util") pod "0aee83ee-f791-404c-be46-2bb64f55c862" (UID: "0aee83ee-f791-404c-be46-2bb64f55c862"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:08:57.997215 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.997157 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:08:57.997215 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.997180 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-924jl\" (UniqueName: \"kubernetes.io/projected/0aee83ee-f791-404c-be46-2bb64f55c862-kube-api-access-924jl\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:08:57.997215 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:57.997193 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0aee83ee-f791-404c-be46-2bb64f55c862-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:08:58.681475 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:58.681433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" event={"ID":"0aee83ee-f791-404c-be46-2bb64f55c862","Type":"ContainerDied","Data":"3e42a0c8d95193074d70f4ddba6728f9f0c8d3b1f78797dfe876ae3efb3c7e6e"} Apr 21 16:08:58.681475 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:58.681471 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e42a0c8d95193074d70f4ddba6728f9f0c8d3b1f78797dfe876ae3efb3c7e6e" Apr 21 16:08:58.682052 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:08:58.681592 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fsgttm" Apr 21 16:09:01.675334 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:01.675300 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-m2wb4" Apr 21 16:09:08.617817 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.617777 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb"] Apr 21 16:09:08.618572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.618549 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aee83ee-f791-404c-be46-2bb64f55c862" containerName="util" Apr 21 16:09:08.618672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.618576 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aee83ee-f791-404c-be46-2bb64f55c862" containerName="util" Apr 21 16:09:08.618672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.618624 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aee83ee-f791-404c-be46-2bb64f55c862" containerName="extract" Apr 21 16:09:08.618672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.618633 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aee83ee-f791-404c-be46-2bb64f55c862" containerName="extract" Apr 21 16:09:08.618672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.618664 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aee83ee-f791-404c-be46-2bb64f55c862" containerName="pull" Apr 21 16:09:08.618672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.618673 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aee83ee-f791-404c-be46-2bb64f55c862" 
containerName="pull" Apr 21 16:09:08.618900 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.618821 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0aee83ee-f791-404c-be46-2bb64f55c862" containerName="extract" Apr 21 16:09:08.622793 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.622765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.626067 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.626043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 16:09:08.626170 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.626120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-24zmv\"" Apr 21 16:09:08.626750 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.626735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 16:09:08.629383 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.629361 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb"] Apr 21 16:09:08.781696 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.781671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5fwf\" (UniqueName: \"kubernetes.io/projected/8381a432-1014-4a7b-ae44-1de854984bbd-kube-api-access-k5fwf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.781838 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.781707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.781838 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.781732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.883048 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.882983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.883161 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.883076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5fwf\" (UniqueName: 
\"kubernetes.io/projected/8381a432-1014-4a7b-ae44-1de854984bbd-kube-api-access-k5fwf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.883161 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.883118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.883352 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.883331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.883457 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.883437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.893490 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.893464 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5fwf\" (UniqueName: \"kubernetes.io/projected/8381a432-1014-4a7b-ae44-1de854984bbd-kube-api-access-k5fwf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:08.933283 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:08.933261 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:09.289061 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:09.289033 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb"] Apr 21 16:09:09.292110 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:09:09.292080 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8381a432_1014_4a7b_ae44_1de854984bbd.slice/crio-593257e44ab0363f5834fc9e745f7f02758ed5380cff1806aa452caa1f6f2012 WatchSource:0}: Error finding container 593257e44ab0363f5834fc9e745f7f02758ed5380cff1806aa452caa1f6f2012: Status 404 returned error can't find the container with id 593257e44ab0363f5834fc9e745f7f02758ed5380cff1806aa452caa1f6f2012 Apr 21 16:09:09.721933 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:09.721833 2576 generic.go:358] "Generic (PLEG): container finished" podID="8381a432-1014-4a7b-ae44-1de854984bbd" containerID="10a6078abab9c18d45773edab8eea992969adf08e6ed19d37e0d9b394ede45b6" exitCode=0 Apr 21 16:09:09.722593 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:09.721952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" event={"ID":"8381a432-1014-4a7b-ae44-1de854984bbd","Type":"ContainerDied","Data":"10a6078abab9c18d45773edab8eea992969adf08e6ed19d37e0d9b394ede45b6"} Apr 21 16:09:09.722593 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:09.722015 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" event={"ID":"8381a432-1014-4a7b-ae44-1de854984bbd","Type":"ContainerStarted","Data":"593257e44ab0363f5834fc9e745f7f02758ed5380cff1806aa452caa1f6f2012"} Apr 21 16:09:10.727691 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:10.727623 2576 generic.go:358] "Generic (PLEG): container finished" podID="8381a432-1014-4a7b-ae44-1de854984bbd" containerID="7d5402945e7fa0f1cbdc839847fc7171257850a4defbc8763b92bdb62d679bf4" exitCode=0 Apr 21 16:09:10.728019 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:10.727719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" event={"ID":"8381a432-1014-4a7b-ae44-1de854984bbd","Type":"ContainerDied","Data":"7d5402945e7fa0f1cbdc839847fc7171257850a4defbc8763b92bdb62d679bf4"} Apr 21 16:09:11.732832 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:11.732790 2576 generic.go:358] "Generic (PLEG): container finished" podID="8381a432-1014-4a7b-ae44-1de854984bbd" containerID="5bcc0ffeb626bde1c4d9c8fc224d52a852c3915f6e2e5c474af147313f8b668b" exitCode=0 Apr 21 16:09:11.733212 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:11.732881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" event={"ID":"8381a432-1014-4a7b-ae44-1de854984bbd","Type":"ContainerDied","Data":"5bcc0ffeb626bde1c4d9c8fc224d52a852c3915f6e2e5c474af147313f8b668b"} Apr 21 16:09:12.857537 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:12.857513 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:12.915791 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:12.915767 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-util\") pod \"8381a432-1014-4a7b-ae44-1de854984bbd\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " Apr 21 16:09:12.915939 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:12.915800 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5fwf\" (UniqueName: \"kubernetes.io/projected/8381a432-1014-4a7b-ae44-1de854984bbd-kube-api-access-k5fwf\") pod \"8381a432-1014-4a7b-ae44-1de854984bbd\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " Apr 21 16:09:12.915939 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:12.915839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-bundle\") pod \"8381a432-1014-4a7b-ae44-1de854984bbd\" (UID: \"8381a432-1014-4a7b-ae44-1de854984bbd\") " Apr 21 16:09:12.916564 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:12.916534 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-bundle" (OuterVolumeSpecName: "bundle") pod "8381a432-1014-4a7b-ae44-1de854984bbd" (UID: "8381a432-1014-4a7b-ae44-1de854984bbd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:09:12.917753 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:12.917729 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8381a432-1014-4a7b-ae44-1de854984bbd-kube-api-access-k5fwf" (OuterVolumeSpecName: "kube-api-access-k5fwf") pod "8381a432-1014-4a7b-ae44-1de854984bbd" (UID: "8381a432-1014-4a7b-ae44-1de854984bbd"). InnerVolumeSpecName "kube-api-access-k5fwf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:09:12.921182 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:12.921145 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-util" (OuterVolumeSpecName: "util") pod "8381a432-1014-4a7b-ae44-1de854984bbd" (UID: "8381a432-1014-4a7b-ae44-1de854984bbd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:09:13.016351 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:13.016301 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:13.016351 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:13.016322 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5fwf\" (UniqueName: \"kubernetes.io/projected/8381a432-1014-4a7b-ae44-1de854984bbd-kube-api-access-k5fwf\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:13.016351 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:13.016333 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8381a432-1014-4a7b-ae44-1de854984bbd-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:13.743003 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:13.742963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" event={"ID":"8381a432-1014-4a7b-ae44-1de854984bbd","Type":"ContainerDied","Data":"593257e44ab0363f5834fc9e745f7f02758ed5380cff1806aa452caa1f6f2012"} Apr 21 16:09:13.743003 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:13.742999 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5bwpsb" Apr 21 16:09:13.743266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:13.743004 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593257e44ab0363f5834fc9e745f7f02758ed5380cff1806aa452caa1f6f2012" Apr 21 16:09:23.388261 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.388231 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv"] Apr 21 16:09:23.388622 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.388538 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8381a432-1014-4a7b-ae44-1de854984bbd" containerName="pull" Apr 21 16:09:23.388622 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.388549 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8381a432-1014-4a7b-ae44-1de854984bbd" containerName="pull" Apr 21 16:09:23.388622 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.388557 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8381a432-1014-4a7b-ae44-1de854984bbd" containerName="util" Apr 21 16:09:23.388622 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.388564 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8381a432-1014-4a7b-ae44-1de854984bbd" containerName="util" Apr 21 16:09:23.388622 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.388569 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8381a432-1014-4a7b-ae44-1de854984bbd" containerName="extract" Apr 21 16:09:23.388622 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.388576 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8381a432-1014-4a7b-ae44-1de854984bbd" containerName="extract" Apr 21 16:09:23.388809 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.388654 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8381a432-1014-4a7b-ae44-1de854984bbd" 
containerName="extract" Apr 21 16:09:23.393417 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.393397 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.397327 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.397310 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-24zmv\"" Apr 21 16:09:23.398271 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.398252 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 16:09:23.398338 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.398288 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 16:09:23.409207 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.409181 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv"] Apr 21 16:09:23.482682 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.482656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.482787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.482706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.482876 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.482788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s9p8\" (UniqueName: \"kubernetes.io/projected/a80d5ead-f6c5-4be1-84e2-e14d35161f93-kube-api-access-5s9p8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.583743 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.583700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.583960 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.583796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s9p8\" (UniqueName: \"kubernetes.io/projected/a80d5ead-f6c5-4be1-84e2-e14d35161f93-kube-api-access-5s9p8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.584031 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.583967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.584256 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.584231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.584442 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.584321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.596518 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.596494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s9p8\" (UniqueName: \"kubernetes.io/projected/a80d5ead-f6c5-4be1-84e2-e14d35161f93-kube-api-access-5s9p8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.706512 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.706439 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:23.848235 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:23.847286 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv"] Apr 21 16:09:23.852033 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:09:23.851997 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda80d5ead_f6c5_4be1_84e2_e14d35161f93.slice/crio-9d3c2ce0ca0b6c2564ce7633f9ddde83e4f6d31c17b9bb1b59fe57c37379a899 WatchSource:0}: Error finding container 9d3c2ce0ca0b6c2564ce7633f9ddde83e4f6d31c17b9bb1b59fe57c37379a899: Status 404 returned error can't find the container with id 9d3c2ce0ca0b6c2564ce7633f9ddde83e4f6d31c17b9bb1b59fe57c37379a899 Apr 21 16:09:24.784617 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:24.784581 2576 generic.go:358] "Generic (PLEG): container finished" podID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerID="ce05179c3b934a19554435df1d0df20131309204362227ebb084b86f6e4ce58f" exitCode=0 Apr 21 16:09:24.785044 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:24.784671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" event={"ID":"a80d5ead-f6c5-4be1-84e2-e14d35161f93","Type":"ContainerDied","Data":"ce05179c3b934a19554435df1d0df20131309204362227ebb084b86f6e4ce58f"} Apr 21 16:09:24.785044 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:24.784712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" event={"ID":"a80d5ead-f6c5-4be1-84e2-e14d35161f93","Type":"ContainerStarted","Data":"9d3c2ce0ca0b6c2564ce7633f9ddde83e4f6d31c17b9bb1b59fe57c37379a899"} Apr 21 16:09:25.260479 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.260446 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87"] Apr 21 16:09:25.263677 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.263660 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.266283 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.266228 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 16:09:25.266422 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.266286 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5mxtt\"" Apr 21 16:09:25.266422 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.266311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 16:09:25.266634 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.266616 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 16:09:25.266699 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.266656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 16:09:25.282372 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.282350 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87"] Apr 21 16:09:25.297488 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.297468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vlbl\" (UniqueName: \"kubernetes.io/projected/8b028677-6d59-4d75-9eb8-5758225d6d88-kube-api-access-5vlbl\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.297586 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.297520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8b028677-6d59-4d75-9eb8-5758225d6d88-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.297586 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.297556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b028677-6d59-4d75-9eb8-5758225d6d88-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.398266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.398239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vlbl\" (UniqueName: \"kubernetes.io/projected/8b028677-6d59-4d75-9eb8-5758225d6d88-kube-api-access-5vlbl\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.398430 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.398295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/8b028677-6d59-4d75-9eb8-5758225d6d88-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.398430 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.398352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b028677-6d59-4d75-9eb8-5758225d6d88-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.400884 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.400841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8b028677-6d59-4d75-9eb8-5758225d6d88-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.400978 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.400886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b028677-6d59-4d75-9eb8-5758225d6d88-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.407262 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.407242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vlbl\" (UniqueName: \"kubernetes.io/projected/8b028677-6d59-4d75-9eb8-5758225d6d88-kube-api-access-5vlbl\") pod \"opendatahub-operator-controller-manager-774f54dc87-vqs87\" (UID: \"8b028677-6d59-4d75-9eb8-5758225d6d88\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.574243 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.574171 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:25.701411 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.701379 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh"] Apr 21 16:09:25.705557 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.705531 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.710338 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.710319 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 16:09:25.710566 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.710551 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 16:09:25.710811 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.710796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-mmz8p\"" Apr 21 16:09:25.711130 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.711115 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 16:09:25.711327 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.711315 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 16:09:25.711934 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.711917 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:09:25.717670 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.717646 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh"] Apr 21 16:09:25.727988 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.727966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87"] Apr 21 16:09:25.730387 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:09:25.730357 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b028677_6d59_4d75_9eb8_5758225d6d88.slice/crio-7b5f9aa81b0125e00228d335f79ae2a80e6cc5cd3bdaee444ed5e72d7d67046d WatchSource:0}: Error finding container 7b5f9aa81b0125e00228d335f79ae2a80e6cc5cd3bdaee444ed5e72d7d67046d: Status 404 returned error can't find the container with id 7b5f9aa81b0125e00228d335f79ae2a80e6cc5cd3bdaee444ed5e72d7d67046d Apr 21 16:09:25.788437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.788409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" event={"ID":"8b028677-6d59-4d75-9eb8-5758225d6d88","Type":"ContainerStarted","Data":"7b5f9aa81b0125e00228d335f79ae2a80e6cc5cd3bdaee444ed5e72d7d67046d"} Apr 21 16:09:25.789949 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.789927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" event={"ID":"a80d5ead-f6c5-4be1-84e2-e14d35161f93","Type":"ContainerStarted","Data":"83929f6741760dd75c6cc4b10ad5e50bd4b80ea042fc2f9c4ef245d6fd8204fb"} Apr 21 16:09:25.801710 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.801688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/597d1320-fab9-47a1-8230-333f5695b010-metrics-cert\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " 
pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.801804 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.801729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597d1320-fab9-47a1-8230-333f5695b010-cert\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.801848 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.801826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bbfr\" (UniqueName: \"kubernetes.io/projected/597d1320-fab9-47a1-8230-333f5695b010-kube-api-access-2bbfr\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.801917 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.801852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/597d1320-fab9-47a1-8230-333f5695b010-manager-config\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.902790 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.902766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bbfr\" (UniqueName: \"kubernetes.io/projected/597d1320-fab9-47a1-8230-333f5695b010-kube-api-access-2bbfr\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.902941 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.902798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/597d1320-fab9-47a1-8230-333f5695b010-manager-config\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.902941 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.902835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/597d1320-fab9-47a1-8230-333f5695b010-metrics-cert\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.902941 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.902873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597d1320-fab9-47a1-8230-333f5695b010-cert\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.903492 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.903469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/597d1320-fab9-47a1-8230-333f5695b010-manager-config\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " 
pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.905202 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.905183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/597d1320-fab9-47a1-8230-333f5695b010-metrics-cert\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.905300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.905261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597d1320-fab9-47a1-8230-333f5695b010-cert\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:25.911847 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:25.911823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bbfr\" (UniqueName: \"kubernetes.io/projected/597d1320-fab9-47a1-8230-333f5695b010-kube-api-access-2bbfr\") pod \"lws-controller-manager-d98d6987c-5clzh\" (UID: \"597d1320-fab9-47a1-8230-333f5695b010\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:26.015690 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:26.015670 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:26.187317 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:09:26.187197 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod597d1320_fab9_47a1_8230_333f5695b010.slice/crio-16f669d612f520d264bb8aa1d7c4d1124ef57eb1fb99a1ab07a778915a3ac1d8 WatchSource:0}: Error finding container 16f669d612f520d264bb8aa1d7c4d1124ef57eb1fb99a1ab07a778915a3ac1d8: Status 404 returned error can't find the container with id 16f669d612f520d264bb8aa1d7c4d1124ef57eb1fb99a1ab07a778915a3ac1d8 Apr 21 16:09:26.188720 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:26.188697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh"] Apr 21 16:09:26.797289 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:26.797250 2576 generic.go:358] "Generic (PLEG): container finished" podID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerID="83929f6741760dd75c6cc4b10ad5e50bd4b80ea042fc2f9c4ef245d6fd8204fb" exitCode=0 Apr 21 16:09:26.797786 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:26.797342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" event={"ID":"a80d5ead-f6c5-4be1-84e2-e14d35161f93","Type":"ContainerDied","Data":"83929f6741760dd75c6cc4b10ad5e50bd4b80ea042fc2f9c4ef245d6fd8204fb"} Apr 21 16:09:26.798938 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:26.798887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" event={"ID":"597d1320-fab9-47a1-8230-333f5695b010","Type":"ContainerStarted","Data":"16f669d612f520d264bb8aa1d7c4d1124ef57eb1fb99a1ab07a778915a3ac1d8"} Apr 21 16:09:27.809628 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:27.809595 2576 generic.go:358] "Generic (PLEG): container finished" podID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" 
containerID="b9233022631106c4ade851b79a7ba033e50c94f49e1edd03738ca2a8368b8bd7" exitCode=0 Apr 21 16:09:27.810060 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:27.809682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" event={"ID":"a80d5ead-f6c5-4be1-84e2-e14d35161f93","Type":"ContainerDied","Data":"b9233022631106c4ade851b79a7ba033e50c94f49e1edd03738ca2a8368b8bd7"} Apr 21 16:09:28.946526 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:28.946506 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:29.027276 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.027216 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-bundle\") pod \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " Apr 21 16:09:29.027448 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.027326 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-util\") pod \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " Apr 21 16:09:29.027448 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.027362 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s9p8\" (UniqueName: \"kubernetes.io/projected/a80d5ead-f6c5-4be1-84e2-e14d35161f93-kube-api-access-5s9p8\") pod \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\" (UID: \"a80d5ead-f6c5-4be1-84e2-e14d35161f93\") " Apr 21 16:09:29.028571 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.028359 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-bundle" (OuterVolumeSpecName: "bundle") pod "a80d5ead-f6c5-4be1-84e2-e14d35161f93" (UID: "a80d5ead-f6c5-4be1-84e2-e14d35161f93"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:09:29.030000 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.029976 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a80d5ead-f6c5-4be1-84e2-e14d35161f93-kube-api-access-5s9p8" (OuterVolumeSpecName: "kube-api-access-5s9p8") pod "a80d5ead-f6c5-4be1-84e2-e14d35161f93" (UID: "a80d5ead-f6c5-4be1-84e2-e14d35161f93"). InnerVolumeSpecName "kube-api-access-5s9p8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:09:29.055491 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.055466 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-util" (OuterVolumeSpecName: "util") pod "a80d5ead-f6c5-4be1-84e2-e14d35161f93" (UID: "a80d5ead-f6c5-4be1-84e2-e14d35161f93"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:09:29.128290 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.128220 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:29.128290 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.128247 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a80d5ead-f6c5-4be1-84e2-e14d35161f93-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:29.128290 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.128257 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5s9p8\" (UniqueName: \"kubernetes.io/projected/a80d5ead-f6c5-4be1-84e2-e14d35161f93-kube-api-access-5s9p8\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:29.817285 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.817239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" event={"ID":"8b028677-6d59-4d75-9eb8-5758225d6d88","Type":"ContainerStarted","Data":"03499155ebce1fc7feb481b5b28fd9b1662cd6c6e9831312b34ae517fea1607a"} Apr 21 16:09:29.817496 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.817361 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:29.818836 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.818811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" event={"ID":"a80d5ead-f6c5-4be1-84e2-e14d35161f93","Type":"ContainerDied","Data":"9d3c2ce0ca0b6c2564ce7633f9ddde83e4f6d31c17b9bb1b59fe57c37379a899"} Apr 21 16:09:29.818958 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.818841 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d3c2ce0ca0b6c2564ce7633f9ddde83e4f6d31c17b9bb1b59fe57c37379a899" Apr 21 16:09:29.818958 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.818820 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9tqgzv" Apr 21 16:09:29.820247 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.820225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" event={"ID":"597d1320-fab9-47a1-8230-333f5695b010","Type":"ContainerStarted","Data":"9a0820bca9db47baab93a722b4025f129f848f7928ebb65f65ced0aa86aaa5f8"} Apr 21 16:09:29.820354 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.820339 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:29.839777 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.839738 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" podStartSLOduration=1.725371827 podStartE2EDuration="4.839727076s" podCreationTimestamp="2026-04-21 16:09:25 +0000 UTC" firstStartedPulling="2026-04-21 16:09:25.732084225 +0000 UTC m=+454.108266028" lastFinishedPulling="2026-04-21 16:09:28.846439488 +0000 UTC m=+457.222621277" observedRunningTime="2026-04-21 16:09:29.838318549 +0000 UTC m=+458.214500361" watchObservedRunningTime="2026-04-21 16:09:29.839727076 +0000 UTC m=+458.215908886" Apr 21 16:09:29.860980 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:29.860946 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" podStartSLOduration=2.21081774 podStartE2EDuration="4.860937215s" podCreationTimestamp="2026-04-21 16:09:25 +0000 UTC" firstStartedPulling="2026-04-21 16:09:26.189283136 +0000 UTC m=+454.565464926" lastFinishedPulling="2026-04-21 16:09:28.839402596 +0000 UTC m=+457.215584401" observedRunningTime="2026-04-21 16:09:29.860705179 +0000 UTC m=+458.236886989" watchObservedRunningTime="2026-04-21 16:09:29.860937215 +0000 UTC m=+458.237119047" Apr 21 16:09:40.826954 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:40.826919 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-5clzh" Apr 21 16:09:40.827421 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:40.827134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-vqs87" Apr 21 16:09:43.166672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.166634 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m"] Apr 21 16:09:43.167401 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.167378 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerName="extract" Apr 21 16:09:43.167476 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.167409 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerName="extract" Apr 21 16:09:43.167476 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.167435 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerName="pull" Apr 21 16:09:43.167476 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.167445 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerName="pull" Apr 21 16:09:43.167476 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.167462 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerName="util" Apr 21 16:09:43.167476 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.167471 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerName="util" Apr 21 16:09:43.167687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.167638 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a80d5ead-f6c5-4be1-84e2-e14d35161f93" containerName="extract" Apr 21 16:09:43.171331 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.171307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.187409 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.187389 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-24zmv\"" Apr 21 16:09:43.188356 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.188341 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 16:09:43.194774 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.194756 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 16:09:43.236235 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.236209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.236346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.236253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.236346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.236325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4gzq\" (UniqueName: \"kubernetes.io/projected/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-kube-api-access-z4gzq\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.237642 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.237621 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m"] Apr 21 16:09:43.337561 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.337525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.337680 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.337579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.337746 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.337704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4gzq\" (UniqueName: \"kubernetes.io/projected/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-kube-api-access-z4gzq\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.338069 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.338046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.338154 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.338094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.354716 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.354685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4gzq\" (UniqueName: \"kubernetes.io/projected/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-kube-api-access-z4gzq\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.480335 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.480126 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:43.693660 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.693560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m"] Apr 21 16:09:43.696590 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:09:43.696534 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3689a33f_3911_4037_acaf_ee9dd1bb9f6e.slice/crio-ab26e7e5847dc99d0207d94a26e61335d0ff9580e93698dc47b06b7bff0df488 WatchSource:0}: Error finding container ab26e7e5847dc99d0207d94a26e61335d0ff9580e93698dc47b06b7bff0df488: Status 404 returned error can't find the container with id ab26e7e5847dc99d0207d94a26e61335d0ff9580e93698dc47b06b7bff0df488 Apr 21 16:09:43.878418 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.878345 2576 generic.go:358] "Generic (PLEG): container finished" podID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerID="7c44b36f2a8093ad6bdb436170ec4ebdec74e56c6a9806fb356e22cd0a3129b1" exitCode=0 Apr 21 16:09:43.878418 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.878402 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" event={"ID":"3689a33f-3911-4037-acaf-ee9dd1bb9f6e","Type":"ContainerDied","Data":"7c44b36f2a8093ad6bdb436170ec4ebdec74e56c6a9806fb356e22cd0a3129b1"} Apr 21 16:09:43.878817 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:43.878460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" event={"ID":"3689a33f-3911-4037-acaf-ee9dd1bb9f6e","Type":"ContainerStarted","Data":"ab26e7e5847dc99d0207d94a26e61335d0ff9580e93698dc47b06b7bff0df488"} Apr 21 16:09:44.883873 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:44.883830 2576 generic.go:358] "Generic (PLEG): container finished" podID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerID="320831c4d33eafc72309ad834a8a778b5465a751c14e52e8cb4565c0cd070aa7" exitCode=0 Apr 21 16:09:44.884224 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:44.883894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" event={"ID":"3689a33f-3911-4037-acaf-ee9dd1bb9f6e","Type":"ContainerDied","Data":"320831c4d33eafc72309ad834a8a778b5465a751c14e52e8cb4565c0cd070aa7"} Apr 21 16:09:45.889815 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:45.889780 2576 generic.go:358] "Generic (PLEG): container finished" podID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerID="a7d6c06bef8d6dc85e549cdfd095883ad45030f30b8f981ac8b7b6898b64d16f" exitCode=0 Apr 21 16:09:45.890234 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:45.889822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" event={"ID":"3689a33f-3911-4037-acaf-ee9dd1bb9f6e","Type":"ContainerDied","Data":"a7d6c06bef8d6dc85e549cdfd095883ad45030f30b8f981ac8b7b6898b64d16f"} Apr 21 16:09:47.018713 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.018688 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:47.070308 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.070279 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-bundle\") pod \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " Apr 21 16:09:47.070471 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.070366 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-util\") pod \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " Apr 21 16:09:47.070471 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.070413 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4gzq\" (UniqueName: \"kubernetes.io/projected/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-kube-api-access-z4gzq\") pod \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\" (UID: \"3689a33f-3911-4037-acaf-ee9dd1bb9f6e\") " Apr 21 16:09:47.071229 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.071204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-bundle" (OuterVolumeSpecName: "bundle") pod "3689a33f-3911-4037-acaf-ee9dd1bb9f6e" (UID: "3689a33f-3911-4037-acaf-ee9dd1bb9f6e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:09:47.072531 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.072508 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-kube-api-access-z4gzq" (OuterVolumeSpecName: "kube-api-access-z4gzq") pod "3689a33f-3911-4037-acaf-ee9dd1bb9f6e" (UID: "3689a33f-3911-4037-acaf-ee9dd1bb9f6e"). InnerVolumeSpecName "kube-api-access-z4gzq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:09:47.078914 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.078885 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-util" (OuterVolumeSpecName: "util") pod "3689a33f-3911-4037-acaf-ee9dd1bb9f6e" (UID: "3689a33f-3911-4037-acaf-ee9dd1bb9f6e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:09:47.171889 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.171814 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:47.171889 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.171837 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4gzq\" (UniqueName: \"kubernetes.io/projected/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-kube-api-access-z4gzq\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:47.171889 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.171848 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3689a33f-3911-4037-acaf-ee9dd1bb9f6e-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:09:47.898870 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.898839 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" Apr 21 16:09:47.899037 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.898828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sbj6m" event={"ID":"3689a33f-3911-4037-acaf-ee9dd1bb9f6e","Type":"ContainerDied","Data":"ab26e7e5847dc99d0207d94a26e61335d0ff9580e93698dc47b06b7bff0df488"} Apr 21 16:09:47.899037 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:47.898977 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab26e7e5847dc99d0207d94a26e61335d0ff9580e93698dc47b06b7bff0df488" Apr 21 16:09:52.588747 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.588718 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2"] Apr 21 16:09:52.589150 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.589114 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerName="util" Apr 21 16:09:52.589150 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.589128 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerName="util" Apr 21 16:09:52.589150 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.589148 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerName="pull" Apr 21 16:09:52.589252 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.589153 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerName="pull" Apr 21 16:09:52.589252 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.589161 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerName="extract" Apr 21 16:09:52.589252 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.589167 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" containerName="extract" Apr 21 16:09:52.589252 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.589226 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3689a33f-3911-4037-acaf-ee9dd1bb9f6e" 
containerName="extract" Apr 21 16:09:52.592213 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.592192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.600530 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.600515 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-24zmv\"" Apr 21 16:09:52.601320 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.601306 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 16:09:52.625098 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.625079 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 16:09:52.713751 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.713727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kt2k\" (UniqueName: \"kubernetes.io/projected/54f30169-e920-42cb-8180-38d97001d712-kube-api-access-4kt2k\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.713848 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.713755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.713848 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.713781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.810466 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.810443 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2"] Apr 21 16:09:52.814445 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.814423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kt2k\" (UniqueName: \"kubernetes.io/projected/54f30169-e920-42cb-8180-38d97001d712-kube-api-access-4kt2k\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.814530 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.814455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.814530 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.814477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.814868 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.814837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.814903 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.814883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.889554 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.889515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kt2k\" (UniqueName: \"kubernetes.io/projected/54f30169-e920-42cb-8180-38d97001d712-kube-api-access-4kt2k\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:52.901312 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:52.901292 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:09:53.044937 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:53.044910 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2"] Apr 21 16:09:53.047159 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:09:53.047127 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54f30169_e920_42cb_8180_38d97001d712.slice/crio-a7ee0c8f212c5ed7216e7330991e81db9e3203299abac7a0756de69664cbc77a WatchSource:0}: Error finding container a7ee0c8f212c5ed7216e7330991e81db9e3203299abac7a0756de69664cbc77a: Status 404 returned error can't find the container with id a7ee0c8f212c5ed7216e7330991e81db9e3203299abac7a0756de69664cbc77a Apr 21 16:09:53.933478 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:53.933440 2576 generic.go:358] "Generic (PLEG): container finished" podID="54f30169-e920-42cb-8180-38d97001d712" containerID="89c6830c5e4ee76c3351d75162356ccc0c6891174533dd070f9263f148c89070" exitCode=0 Apr 21 16:09:53.934014 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:53.933551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" event={"ID":"54f30169-e920-42cb-8180-38d97001d712","Type":"ContainerDied","Data":"89c6830c5e4ee76c3351d75162356ccc0c6891174533dd070f9263f148c89070"} Apr 21 16:09:53.934014 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:09:53.933580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" event={"ID":"54f30169-e920-42cb-8180-38d97001d712","Type":"ContainerStarted","Data":"a7ee0c8f212c5ed7216e7330991e81db9e3203299abac7a0756de69664cbc77a"} Apr 21 16:10:02.967934 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:02.967900 2576 generic.go:358] "Generic (PLEG): container finished" podID="54f30169-e920-42cb-8180-38d97001d712" containerID="d10e04ddc997a112c7336070ff7b88daa6c2c7e4b1758f348df4ee66892ca08c" exitCode=0 Apr 21 16:10:02.968313 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:02.967994 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" event={"ID":"54f30169-e920-42cb-8180-38d97001d712","Type":"ContainerDied","Data":"d10e04ddc997a112c7336070ff7b88daa6c2c7e4b1758f348df4ee66892ca08c"} Apr 21 16:10:03.973743 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:03.973711 2576 generic.go:358] "Generic (PLEG): container finished" podID="54f30169-e920-42cb-8180-38d97001d712" containerID="0bad7a02d504d2e0cfdb56ad8a2970420fb6ede8fc2f21ead8221876965b7c5a" exitCode=0 Apr 21 16:10:03.974135 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:03.973765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" event={"ID":"54f30169-e920-42cb-8180-38d97001d712","Type":"ContainerDied","Data":"0bad7a02d504d2e0cfdb56ad8a2970420fb6ede8fc2f21ead8221876965b7c5a"} Apr 21 16:10:05.095567 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.095539 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:10:05.228144 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.228077 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kt2k\" (UniqueName: \"kubernetes.io/projected/54f30169-e920-42cb-8180-38d97001d712-kube-api-access-4kt2k\") pod \"54f30169-e920-42cb-8180-38d97001d712\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " Apr 21 16:10:05.228144 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.228141 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-util\") pod \"54f30169-e920-42cb-8180-38d97001d712\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " Apr 21 16:10:05.228329 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.228169 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-bundle\") pod \"54f30169-e920-42cb-8180-38d97001d712\" (UID: \"54f30169-e920-42cb-8180-38d97001d712\") " Apr 21 16:10:05.229085 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.229055 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-bundle" (OuterVolumeSpecName: "bundle") pod "54f30169-e920-42cb-8180-38d97001d712" (UID: "54f30169-e920-42cb-8180-38d97001d712"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:10:05.230282 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.230255 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f30169-e920-42cb-8180-38d97001d712-kube-api-access-4kt2k" (OuterVolumeSpecName: "kube-api-access-4kt2k") pod "54f30169-e920-42cb-8180-38d97001d712" (UID: "54f30169-e920-42cb-8180-38d97001d712"). InnerVolumeSpecName "kube-api-access-4kt2k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:10:05.233408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.233385 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-util" (OuterVolumeSpecName: "util") pod "54f30169-e920-42cb-8180-38d97001d712" (UID: "54f30169-e920-42cb-8180-38d97001d712"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:10:05.328708 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.328682 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kt2k\" (UniqueName: \"kubernetes.io/projected/54f30169-e920-42cb-8180-38d97001d712-kube-api-access-4kt2k\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:10:05.328708 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.328702 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:10:05.328708 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.328711 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54f30169-e920-42cb-8180-38d97001d712-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:10:05.983299 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.983267 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" Apr 21 16:10:05.983479 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.983261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wvsh2" event={"ID":"54f30169-e920-42cb-8180-38d97001d712","Type":"ContainerDied","Data":"a7ee0c8f212c5ed7216e7330991e81db9e3203299abac7a0756de69664cbc77a"} Apr 21 16:10:05.983479 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:05.983387 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7ee0c8f212c5ed7216e7330991e81db9e3203299abac7a0756de69664cbc77a" Apr 21 16:10:27.618962 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.618930 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft"] Apr 21 16:10:27.619449 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.619433 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54f30169-e920-42cb-8180-38d97001d712" containerName="extract" Apr 21 16:10:27.619501 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.619453 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f30169-e920-42cb-8180-38d97001d712" containerName="extract" Apr 21 16:10:27.619501 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.619476 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54f30169-e920-42cb-8180-38d97001d712" containerName="pull" Apr 21 16:10:27.619501 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.619484 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f30169-e920-42cb-8180-38d97001d712" containerName="pull" Apr 21 16:10:27.619594 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.619524 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54f30169-e920-42cb-8180-38d97001d712" containerName="util" Apr 21 16:10:27.619594 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.619534 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f30169-e920-42cb-8180-38d97001d712" containerName="util" Apr 21 16:10:27.619659 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.619606 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="54f30169-e920-42cb-8180-38d97001d712" 
containerName="extract" Apr 21 16:10:27.623205 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.623184 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.625755 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.625735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 16:10:27.625907 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.625792 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-cpr7w\"" Apr 21 16:10:27.634604 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.634583 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft"] Apr 21 16:10:27.798185 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.798150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/50c064db-1acb-40aa-87af-19f89d6d568c-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.798311 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.798217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.798311 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.798243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.798311 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.798268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/50c064db-1acb-40aa-87af-19f89d6d568c-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.798454 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.798313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.798454 ip-10-0-138-191 kubenswrapper[2576]: I0421 
16:10:27.798358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/50c064db-1acb-40aa-87af-19f89d6d568c-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.798454 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.798378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.798454 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.798430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.798593 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.798462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzz8x\" (UniqueName: \"kubernetes.io/projected/50c064db-1acb-40aa-87af-19f89d6d568c-kube-api-access-lzz8x\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.899688 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/50c064db-1acb-40aa-87af-19f89d6d568c-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.899814 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.899814 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.899814 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lzz8x\" (UniqueName: \"kubernetes.io/projected/50c064db-1acb-40aa-87af-19f89d6d568c-kube-api-access-lzz8x\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.899814 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/50c064db-1acb-40aa-87af-19f89d6d568c-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900063 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900063 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900063 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/50c064db-1acb-40aa-87af-19f89d6d568c-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900063 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.899946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900214 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.900134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900214 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.900168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-workload-socket\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900332 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.900314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900445 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.900426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.900510 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.900493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/50c064db-1acb-40aa-87af-19f89d6d568c-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.902172 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.902144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/50c064db-1acb-40aa-87af-19f89d6d568c-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.902243 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.902202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/50c064db-1acb-40aa-87af-19f89d6d568c-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.908603 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.908578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/50c064db-1acb-40aa-87af-19f89d6d568c-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.908793 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.908770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzz8x\" (UniqueName: \"kubernetes.io/projected/50c064db-1acb-40aa-87af-19f89d6d568c-kube-api-access-lzz8x\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft\" (UID: \"50c064db-1acb-40aa-87af-19f89d6d568c\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:27.934574 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:27.934548 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:28.061655 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:28.061625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft"] Apr 21 16:10:28.066077 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:10:28.063359 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50c064db_1acb_40aa_87af_19f89d6d568c.slice/crio-3cbd727bb26400b1e5b6cd13af4f6dfb1bbf3fb369f69d43c0fa391d1845704c WatchSource:0}: Error finding container 3cbd727bb26400b1e5b6cd13af4f6dfb1bbf3fb369f69d43c0fa391d1845704c: Status 404 returned error can't find the container with id 3cbd727bb26400b1e5b6cd13af4f6dfb1bbf3fb369f69d43c0fa391d1845704c Apr 21 16:10:29.073876 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:29.073820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" event={"ID":"50c064db-1acb-40aa-87af-19f89d6d568c","Type":"ContainerStarted","Data":"3cbd727bb26400b1e5b6cd13af4f6dfb1bbf3fb369f69d43c0fa391d1845704c"} Apr 21 16:10:30.746335 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:30.746293 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 16:10:30.746595 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:30.746384 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 16:10:30.746595 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:30.746416 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 16:10:31.083164 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:31.083086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" event={"ID":"50c064db-1acb-40aa-87af-19f89d6d568c","Type":"ContainerStarted","Data":"16224e72b7365f5fea72a334702de3513b0a69aa091c1041aab25d66a4f75cd5"} Apr 21 16:10:31.107079 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:31.107030 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" podStartSLOduration=1.42869155 podStartE2EDuration="4.107017826s" podCreationTimestamp="2026-04-21 16:10:27 +0000 UTC" firstStartedPulling="2026-04-21 16:10:28.067708033 +0000 UTC m=+516.443889822" lastFinishedPulling="2026-04-21 16:10:30.746034302 +0000 UTC m=+519.122216098" observedRunningTime="2026-04-21 16:10:31.105729489 +0000 UTC m=+519.481911301" watchObservedRunningTime="2026-04-21 16:10:31.107017826 +0000 UTC m=+519.483199695" Apr 21 16:10:31.935474 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:31.935438 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:31.940022 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:31.939998 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:32.086695 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:32.086674 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:32.087436 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:32.087421 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft" Apr 21 16:10:43.796782 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.796703 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lkk4q"] Apr 21 16:10:43.799999 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.799982 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" Apr 21 16:10:43.802638 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.802617 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 16:10:43.803687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.803669 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 16:10:43.803764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.803687 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-hbkkh\"" Apr 21 16:10:43.811103 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.811078 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lkk4q"] Apr 21 16:10:43.812545 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.812527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbqhh\" (UniqueName: \"kubernetes.io/projected/71bc54b0-2825-438a-81d0-5042c17fe43e-kube-api-access-wbqhh\") pod \"kuadrant-operator-catalog-lkk4q\" (UID: \"71bc54b0-2825-438a-81d0-5042c17fe43e\") " pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" Apr 21 16:10:43.913815 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.913789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbqhh\" (UniqueName: \"kubernetes.io/projected/71bc54b0-2825-438a-81d0-5042c17fe43e-kube-api-access-wbqhh\") pod \"kuadrant-operator-catalog-lkk4q\" (UID: \"71bc54b0-2825-438a-81d0-5042c17fe43e\") " pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" Apr 21 16:10:43.922667 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:43.922633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbqhh\" (UniqueName: \"kubernetes.io/projected/71bc54b0-2825-438a-81d0-5042c17fe43e-kube-api-access-wbqhh\") pod \"kuadrant-operator-catalog-lkk4q\" (UID: \"71bc54b0-2825-438a-81d0-5042c17fe43e\") " pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" Apr 21 16:10:44.111026 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.110964 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" Apr 21 16:10:44.166812 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.166783 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lkk4q"] Apr 21 16:10:44.232901 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.232876 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lkk4q"] Apr 21 16:10:44.234458 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:10:44.234428 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bc54b0_2825_438a_81d0_5042c17fe43e.slice/crio-4e1b7a86a0918107e34616876e103e284b95f711b2fb3011a8bec63044a87a3d WatchSource:0}: Error finding container 4e1b7a86a0918107e34616876e103e284b95f711b2fb3011a8bec63044a87a3d: Status 404 returned error can't find the container with id 4e1b7a86a0918107e34616876e103e284b95f711b2fb3011a8bec63044a87a3d Apr 21 16:10:44.373293 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.373231 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-jlgz6"] Apr 21 16:10:44.377574 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.377559 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 16:10:44.385843 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.385823 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-jlgz6"] Apr 21 16:10:44.420165 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.420143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66h9c\" (UniqueName: \"kubernetes.io/projected/008018a6-2152-4abe-b3dc-0fca25d4db0d-kube-api-access-66h9c\") pod \"kuadrant-operator-catalog-jlgz6\" (UID: \"008018a6-2152-4abe-b3dc-0fca25d4db0d\") " pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 16:10:44.521118 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.521090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66h9c\" (UniqueName: \"kubernetes.io/projected/008018a6-2152-4abe-b3dc-0fca25d4db0d-kube-api-access-66h9c\") pod \"kuadrant-operator-catalog-jlgz6\" (UID: \"008018a6-2152-4abe-b3dc-0fca25d4db0d\") " pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 16:10:44.529454 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.529422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66h9c\" (UniqueName: \"kubernetes.io/projected/008018a6-2152-4abe-b3dc-0fca25d4db0d-kube-api-access-66h9c\") pod \"kuadrant-operator-catalog-jlgz6\" (UID: \"008018a6-2152-4abe-b3dc-0fca25d4db0d\") " pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 16:10:44.687967 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.687942 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 16:10:44.805490 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:44.805466 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-jlgz6"] Apr 21 16:10:44.807839 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:10:44.807813 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008018a6_2152_4abe_b3dc_0fca25d4db0d.slice/crio-cf455a9a9c782e9b2a4e99a955bc906f180c4b8fff39d2ee49b12606e9516da5 WatchSource:0}: Error finding container cf455a9a9c782e9b2a4e99a955bc906f180c4b8fff39d2ee49b12606e9516da5: Status 404 returned error can't find the container with id cf455a9a9c782e9b2a4e99a955bc906f180c4b8fff39d2ee49b12606e9516da5 Apr 21 16:10:45.137311 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:45.137279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" event={"ID":"71bc54b0-2825-438a-81d0-5042c17fe43e","Type":"ContainerStarted","Data":"4e1b7a86a0918107e34616876e103e284b95f711b2fb3011a8bec63044a87a3d"} Apr 21 16:10:45.138358 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:45.138329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" event={"ID":"008018a6-2152-4abe-b3dc-0fca25d4db0d","Type":"ContainerStarted","Data":"cf455a9a9c782e9b2a4e99a955bc906f180c4b8fff39d2ee49b12606e9516da5"} Apr 21 16:10:47.148417 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.148384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" event={"ID":"008018a6-2152-4abe-b3dc-0fca25d4db0d","Type":"ContainerStarted","Data":"04faf022e236d562999b6cebebea81a813d6bfccee73bd5dd3e98df698ed4c1d"} Apr 21 16:10:47.149949 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.149919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" event={"ID":"71bc54b0-2825-438a-81d0-5042c17fe43e","Type":"ContainerStarted","Data":"15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2"} Apr 21 16:10:47.150081 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.149998 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" podUID="71bc54b0-2825-438a-81d0-5042c17fe43e" containerName="registry-server" containerID="cri-o://15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2" gracePeriod=2 Apr 21 16:10:47.176300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.176255 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" podStartSLOduration=1.271814051 podStartE2EDuration="3.176241916s" podCreationTimestamp="2026-04-21 16:10:44 +0000 UTC" firstStartedPulling="2026-04-21 16:10:44.809187262 +0000 UTC m=+533.185369051" lastFinishedPulling="2026-04-21 16:10:46.713615127 +0000 UTC m=+535.089796916" observedRunningTime="2026-04-21 16:10:47.173685161 +0000 UTC m=+535.549866997" watchObservedRunningTime="2026-04-21 16:10:47.176241916 +0000 UTC m=+535.552423765" Apr 21 16:10:47.218595 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.218557 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" podStartSLOduration=1.74376047 podStartE2EDuration="4.218543751s" podCreationTimestamp="2026-04-21 
16:10:43 +0000 UTC" firstStartedPulling="2026-04-21 16:10:44.235926493 +0000 UTC m=+532.612108288" lastFinishedPulling="2026-04-21 16:10:46.710709766 +0000 UTC m=+535.086891569" observedRunningTime="2026-04-21 16:10:47.214829936 +0000 UTC m=+535.591011747" watchObservedRunningTime="2026-04-21 16:10:47.218543751 +0000 UTC m=+535.594725577" Apr 21 16:10:47.400889 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.400829 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" Apr 21 16:10:47.447964 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.447935 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbqhh\" (UniqueName: \"kubernetes.io/projected/71bc54b0-2825-438a-81d0-5042c17fe43e-kube-api-access-wbqhh\") pod \"71bc54b0-2825-438a-81d0-5042c17fe43e\" (UID: \"71bc54b0-2825-438a-81d0-5042c17fe43e\") " Apr 21 16:10:47.450251 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.450223 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bc54b0-2825-438a-81d0-5042c17fe43e-kube-api-access-wbqhh" (OuterVolumeSpecName: "kube-api-access-wbqhh") pod "71bc54b0-2825-438a-81d0-5042c17fe43e" (UID: "71bc54b0-2825-438a-81d0-5042c17fe43e"). InnerVolumeSpecName "kube-api-access-wbqhh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:10:47.549385 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:47.549358 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbqhh\" (UniqueName: \"kubernetes.io/projected/71bc54b0-2825-438a-81d0-5042c17fe43e-kube-api-access-wbqhh\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:10:48.154615 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.154581 2576 generic.go:358] "Generic (PLEG): container finished" podID="71bc54b0-2825-438a-81d0-5042c17fe43e" containerID="15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2" exitCode=0 Apr 21 16:10:48.155004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.154639 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" Apr 21 16:10:48.155004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.154669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" event={"ID":"71bc54b0-2825-438a-81d0-5042c17fe43e","Type":"ContainerDied","Data":"15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2"} Apr 21 16:10:48.155004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.154709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lkk4q" event={"ID":"71bc54b0-2825-438a-81d0-5042c17fe43e","Type":"ContainerDied","Data":"4e1b7a86a0918107e34616876e103e284b95f711b2fb3011a8bec63044a87a3d"} Apr 21 16:10:48.155004 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.154727 2576 scope.go:117] "RemoveContainer" containerID="15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2" Apr 21 16:10:48.163842 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.163826 2576 scope.go:117] "RemoveContainer" containerID="15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2" Apr 21 16:10:48.164143 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:10:48.164122 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2\": container with ID starting with 15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2 not found: ID does not exist" containerID="15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2" Apr 21 16:10:48.164219 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.164150 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2"} err="failed to get container status \"15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2\": rpc error: code = NotFound desc = could not find container \"15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2\": container with ID starting with 15012691c5fc6587bbb1980b59cb14dbbea34d1699d3dd98ec8132ebe6797dc2 not found: ID does not exist" Apr 21 16:10:48.177317 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.177295 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lkk4q"] Apr 21 16:10:48.179160 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.179140 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lkk4q"] Apr 21 16:10:48.252068 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:48.252046 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bc54b0-2825-438a-81d0-5042c17fe43e" path="/var/lib/kubelet/pods/71bc54b0-2825-438a-81d0-5042c17fe43e/volumes" Apr 21 16:10:54.689113 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:54.689072 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 16:10:54.689113 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:54.689119 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 16:10:54.710883 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:54.710835 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 
16:10:55.203833 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:55.203799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-jlgz6" Apr 21 16:10:59.009787 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.009756 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5"] Apr 21 16:10:59.010150 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.010112 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71bc54b0-2825-438a-81d0-5042c17fe43e" containerName="registry-server" Apr 21 16:10:59.010150 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.010123 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bc54b0-2825-438a-81d0-5042c17fe43e" containerName="registry-server" Apr 21 16:10:59.010236 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.010177 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="71bc54b0-2825-438a-81d0-5042c17fe43e" containerName="registry-server" Apr 21 16:10:59.014696 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.014680 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.017259 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.017241 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hpb5g\"" Apr 21 16:10:59.023030 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.023010 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5"] Apr 21 16:10:59.141334 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.141306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls2mm\" (UniqueName: \"kubernetes.io/projected/b187f500-8a83-41c6-8672-8216c305049d-kube-api-access-ls2mm\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.141438 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.141361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.141501 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.141435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.242632 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.242596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.242764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.242659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.242764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.242706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ls2mm\" (UniqueName: \"kubernetes.io/projected/b187f500-8a83-41c6-8672-8216c305049d-kube-api-access-ls2mm\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.243002 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.242979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.243086 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.243003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.252161 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.252139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls2mm\" (UniqueName: \"kubernetes.io/projected/b187f500-8a83-41c6-8672-8216c305049d-kube-api-access-ls2mm\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.325125 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.325063 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:10:59.416225 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.416194 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9"] Apr 21 16:10:59.421308 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.421222 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.432146 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.432119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9"] Apr 21 16:10:59.457714 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.457691 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5"] Apr 21 16:10:59.459533 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:10:59.459509 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb187f500_8a83_41c6_8672_8216c305049d.slice/crio-58a5b35285f72d2d97f77e31de019474ad1541a35330717ca3d21193b5c3b121 WatchSource:0}: Error finding container 58a5b35285f72d2d97f77e31de019474ad1541a35330717ca3d21193b5c3b121: Status 404 returned error can't find the container with id 58a5b35285f72d2d97f77e31de019474ad1541a35330717ca3d21193b5c3b121 Apr 21 16:10:59.546442 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.546410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.546602 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.546452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.546602 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.546480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkw99\" (UniqueName: \"kubernetes.io/projected/3bd6de24-3b32-4014-828d-44abd752467b-kube-api-access-bkw99\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.647302 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.647276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.647437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.647325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkw99\" (UniqueName: \"kubernetes.io/projected/3bd6de24-3b32-4014-828d-44abd752467b-kube-api-access-bkw99\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.647437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.647422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.647687 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.647659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.647750 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.647683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.655464 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.655440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkw99\" (UniqueName: \"kubernetes.io/projected/3bd6de24-3b32-4014-828d-44abd752467b-kube-api-access-bkw99\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.735006 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.734979 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:10:59.858989 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:10:59.858956 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9"] Apr 21 16:10:59.861226 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:10:59.861198 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd6de24_3b32_4014_828d_44abd752467b.slice/crio-4adaabba6effade327070fdaee4a52bce82658d60811487a3586152c84354d3c WatchSource:0}: Error finding container 4adaabba6effade327070fdaee4a52bce82658d60811487a3586152c84354d3c: Status 404 returned error can't find the container with id 4adaabba6effade327070fdaee4a52bce82658d60811487a3586152c84354d3c Apr 21 16:11:00.014065 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.014039 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p"] Apr 21 16:11:00.017499 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.017484 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.026602 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.026582 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p"] Apr 21 16:11:00.150312 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.150287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.150416 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.150365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvscb\" (UniqueName: \"kubernetes.io/projected/cb431080-b815-441e-ae40-a7cbb06c00e8-kube-api-access-qvscb\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.150416 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.150390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.201675 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.201617 2576 generic.go:358] "Generic (PLEG): container finished" podID="b187f500-8a83-41c6-8672-8216c305049d" containerID="e456f634e3d8f11e1acbcdc527650e3283ee0840e5621141dc4e2620a1520ddb" exitCode=0 Apr 21 16:11:00.201755 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.201700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" event={"ID":"b187f500-8a83-41c6-8672-8216c305049d","Type":"ContainerDied","Data":"e456f634e3d8f11e1acbcdc527650e3283ee0840e5621141dc4e2620a1520ddb"} Apr 21 16:11:00.201755 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.201738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" event={"ID":"b187f500-8a83-41c6-8672-8216c305049d","Type":"ContainerStarted","Data":"58a5b35285f72d2d97f77e31de019474ad1541a35330717ca3d21193b5c3b121"} Apr 21 16:11:00.203206 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.203182 2576 generic.go:358] "Generic (PLEG): container finished" podID="3bd6de24-3b32-4014-828d-44abd752467b" containerID="353fed8fee1cd9bd07f51a2dbbfc63c16a0890dc0c623dd67e8640b0a004b40f" exitCode=0 Apr 21 16:11:00.203287 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.203242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" event={"ID":"3bd6de24-3b32-4014-828d-44abd752467b","Type":"ContainerDied","Data":"353fed8fee1cd9bd07f51a2dbbfc63c16a0890dc0c623dd67e8640b0a004b40f"} Apr 21 16:11:00.203287 ip-10-0-138-191 kubenswrapper[2576]: 
I0421 16:11:00.203259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" event={"ID":"3bd6de24-3b32-4014-828d-44abd752467b","Type":"ContainerStarted","Data":"4adaabba6effade327070fdaee4a52bce82658d60811487a3586152c84354d3c"} Apr 21 16:11:00.250721 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.250700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.250840 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.250765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvscb\" (UniqueName: \"kubernetes.io/projected/cb431080-b815-441e-ae40-a7cbb06c00e8-kube-api-access-qvscb\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.250840 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.250783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.251112 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.251077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.251174 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.251160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.259091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.259073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvscb\" (UniqueName: \"kubernetes.io/projected/cb431080-b815-441e-ae40-a7cbb06c00e8-kube-api-access-qvscb\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.327031 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.327010 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:00.445734 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.445708 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p"] Apr 21 16:11:00.447273 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:11:00.447242 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb431080_b815_441e_ae40_a7cbb06c00e8.slice/crio-c107ac495f85bcb371369d632ee2ad23aedf15aa8ee03f81a7266a8c0733f750 WatchSource:0}: Error finding container c107ac495f85bcb371369d632ee2ad23aedf15aa8ee03f81a7266a8c0733f750: Status 404 returned error can't find the container with id c107ac495f85bcb371369d632ee2ad23aedf15aa8ee03f81a7266a8c0733f750 Apr 21 16:11:00.616549 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.616516 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9"] Apr 21 16:11:00.620263 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.620240 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.629141 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.629119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9"] Apr 21 16:11:00.756271 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.756075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.756271 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.756177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.756629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.756297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km7xr\" (UniqueName: \"kubernetes.io/projected/514becdf-1310-4620-9c42-c349a3b1359a-kube-api-access-km7xr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.857656 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.857616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 
16:11:00.857827 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.857687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-km7xr\" (UniqueName: \"kubernetes.io/projected/514becdf-1310-4620-9c42-c349a3b1359a-kube-api-access-km7xr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.857827 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.857752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.858034 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.858008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.858072 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.858058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.868525 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.868476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-km7xr\" (UniqueName: \"kubernetes.io/projected/514becdf-1310-4620-9c42-c349a3b1359a-kube-api-access-km7xr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:00.930596 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:00.930512 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:01.212470 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:01.212404 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerID="bfc7df85ae9dadf0f9746b0a50fc07d1fd56e77b6604312c2e6ad382252f970c" exitCode=0 Apr 21 16:11:01.213120 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:01.212622 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" event={"ID":"cb431080-b815-441e-ae40-a7cbb06c00e8","Type":"ContainerDied","Data":"bfc7df85ae9dadf0f9746b0a50fc07d1fd56e77b6604312c2e6ad382252f970c"} Apr 21 16:11:01.213120 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:01.212689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" event={"ID":"cb431080-b815-441e-ae40-a7cbb06c00e8","Type":"ContainerStarted","Data":"c107ac495f85bcb371369d632ee2ad23aedf15aa8ee03f81a7266a8c0733f750"} Apr 21 16:11:01.360270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:01.360235 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9"] Apr 21 16:11:02.221225 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.221195 2576 generic.go:358] "Generic (PLEG): container finished" podID="3bd6de24-3b32-4014-828d-44abd752467b" containerID="63969a5ed831b30c1750b3fd1e44e52905ef66baa1b0abb5b79c6491f86200f9" exitCode=0 Apr 21 16:11:02.221643 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.221280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" event={"ID":"3bd6de24-3b32-4014-828d-44abd752467b","Type":"ContainerDied","Data":"63969a5ed831b30c1750b3fd1e44e52905ef66baa1b0abb5b79c6491f86200f9"} Apr 21 16:11:02.222668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.222644 2576 generic.go:358] "Generic (PLEG): container finished" podID="514becdf-1310-4620-9c42-c349a3b1359a" containerID="eb4cfc504c3255ccc227dd8585601595c6f4ca2477694b8f17e632ae039a7a1f" exitCode=0 Apr 21 16:11:02.222756 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.222723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" event={"ID":"514becdf-1310-4620-9c42-c349a3b1359a","Type":"ContainerDied","Data":"eb4cfc504c3255ccc227dd8585601595c6f4ca2477694b8f17e632ae039a7a1f"} Apr 21 16:11:02.222810 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.222757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" event={"ID":"514becdf-1310-4620-9c42-c349a3b1359a","Type":"ContainerStarted","Data":"2cb1700732452f607fcd66696c92efdb6c23f4f8298c456537573a4717ab8ddc"} Apr 21 16:11:02.224398 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.224373 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerID="c02bcb36307ac743a5dcd9d7cbd5a107ce28132f0cf7a5afa04e690e2df22b3a" exitCode=0 Apr 21 16:11:02.224498 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.224409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" 
event={"ID":"cb431080-b815-441e-ae40-a7cbb06c00e8","Type":"ContainerDied","Data":"c02bcb36307ac743a5dcd9d7cbd5a107ce28132f0cf7a5afa04e690e2df22b3a"} Apr 21 16:11:02.226032 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.226013 2576 generic.go:358] "Generic (PLEG): container finished" podID="b187f500-8a83-41c6-8672-8216c305049d" containerID="5fdb7ce290d8a0c5b8d5f3a3e002d031390c0bc68e97016196ae02c83c7b60dc" exitCode=0 Apr 21 16:11:02.226109 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:02.226071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" event={"ID":"b187f500-8a83-41c6-8672-8216c305049d","Type":"ContainerDied","Data":"5fdb7ce290d8a0c5b8d5f3a3e002d031390c0bc68e97016196ae02c83c7b60dc"} Apr 21 16:11:03.235959 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:03.235877 2576 generic.go:358] "Generic (PLEG): container finished" podID="3bd6de24-3b32-4014-828d-44abd752467b" containerID="aec7a0d91c9b881814023051d1dc76a7a72d8fdf684fd5009afc4a849ca5085d" exitCode=0 Apr 21 16:11:03.236339 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:03.235950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" event={"ID":"3bd6de24-3b32-4014-828d-44abd752467b","Type":"ContainerDied","Data":"aec7a0d91c9b881814023051d1dc76a7a72d8fdf684fd5009afc4a849ca5085d"} Apr 21 16:11:03.237552 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:03.237514 2576 generic.go:358] "Generic (PLEG): container finished" podID="514becdf-1310-4620-9c42-c349a3b1359a" containerID="fcf8097dcdabfb80245215aea03bc7129e38b7d117e836c7bfd1ed2cc0bcf1ef" exitCode=0 Apr 21 16:11:03.237679 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:03.237600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" event={"ID":"514becdf-1310-4620-9c42-c349a3b1359a","Type":"ContainerDied","Data":"fcf8097dcdabfb80245215aea03bc7129e38b7d117e836c7bfd1ed2cc0bcf1ef"} Apr 21 16:11:03.239606 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:03.239585 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerID="846a677099678bb035992090bb602b6948fcaa30a7c4de80baf49ec176c2be11" exitCode=0 Apr 21 16:11:03.239712 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:03.239663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" event={"ID":"cb431080-b815-441e-ae40-a7cbb06c00e8","Type":"ContainerDied","Data":"846a677099678bb035992090bb602b6948fcaa30a7c4de80baf49ec176c2be11"} Apr 21 16:11:03.241592 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:03.241569 2576 generic.go:358] "Generic (PLEG): container finished" podID="b187f500-8a83-41c6-8672-8216c305049d" containerID="68adebf4c12bbcbe42529c979c19e7d707f0de0a4374c6e039d5aa91627c66c1" exitCode=0 Apr 21 16:11:03.241684 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:03.241646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" event={"ID":"b187f500-8a83-41c6-8672-8216c305049d","Type":"ContainerDied","Data":"68adebf4c12bbcbe42529c979c19e7d707f0de0a4374c6e039d5aa91627c66c1"} Apr 21 16:11:04.247109 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.247074 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="514becdf-1310-4620-9c42-c349a3b1359a" containerID="bbaa1b67f80a5d6d826cddfdc2656fb1d64b4e6d2e5d5fcea3ee4256feb5eb4f" exitCode=0 Apr 21 16:11:04.252771 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.252722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" event={"ID":"514becdf-1310-4620-9c42-c349a3b1359a","Type":"ContainerDied","Data":"bbaa1b67f80a5d6d826cddfdc2656fb1d64b4e6d2e5d5fcea3ee4256feb5eb4f"} Apr 21 16:11:04.393130 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.393103 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:11:04.436636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.436610 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:11:04.440191 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.440169 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:04.490435 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.490401 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-util\") pod \"3bd6de24-3b32-4014-828d-44abd752467b\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " Apr 21 16:11:04.490629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.490441 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-bundle\") pod \"3bd6de24-3b32-4014-828d-44abd752467b\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " Apr 21 16:11:04.490629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.490500 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls2mm\" (UniqueName: \"kubernetes.io/projected/b187f500-8a83-41c6-8672-8216c305049d-kube-api-access-ls2mm\") pod \"b187f500-8a83-41c6-8672-8216c305049d\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " Apr 21 16:11:04.490629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.490527 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-util\") pod \"b187f500-8a83-41c6-8672-8216c305049d\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " Apr 21 16:11:04.490629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.490565 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkw99\" (UniqueName: \"kubernetes.io/projected/3bd6de24-3b32-4014-828d-44abd752467b-kube-api-access-bkw99\") pod \"3bd6de24-3b32-4014-828d-44abd752467b\" (UID: \"3bd6de24-3b32-4014-828d-44abd752467b\") " Apr 21 16:11:04.490629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.490588 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvscb\" (UniqueName: \"kubernetes.io/projected/cb431080-b815-441e-ae40-a7cbb06c00e8-kube-api-access-qvscb\") pod \"cb431080-b815-441e-ae40-a7cbb06c00e8\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " Apr 21 16:11:04.490910 ip-10-0-138-191 kubenswrapper[2576]: I0421 
16:11:04.490656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-bundle\") pod \"cb431080-b815-441e-ae40-a7cbb06c00e8\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " Apr 21 16:11:04.490910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.490680 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-util\") pod \"cb431080-b815-441e-ae40-a7cbb06c00e8\" (UID: \"cb431080-b815-441e-ae40-a7cbb06c00e8\") " Apr 21 16:11:04.490910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.490748 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-bundle\") pod \"b187f500-8a83-41c6-8672-8216c305049d\" (UID: \"b187f500-8a83-41c6-8672-8216c305049d\") " Apr 21 16:11:04.491569 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.491395 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-bundle" (OuterVolumeSpecName: "bundle") pod "3bd6de24-3b32-4014-828d-44abd752467b" (UID: "3bd6de24-3b32-4014-828d-44abd752467b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:04.491569 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.491394 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-bundle" (OuterVolumeSpecName: "bundle") pod "cb431080-b815-441e-ae40-a7cbb06c00e8" (UID: "cb431080-b815-441e-ae40-a7cbb06c00e8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:04.491912 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.491890 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-bundle" (OuterVolumeSpecName: "bundle") pod "b187f500-8a83-41c6-8672-8216c305049d" (UID: "b187f500-8a83-41c6-8672-8216c305049d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:04.492840 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.492809 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd6de24-3b32-4014-828d-44abd752467b-kube-api-access-bkw99" (OuterVolumeSpecName: "kube-api-access-bkw99") pod "3bd6de24-3b32-4014-828d-44abd752467b" (UID: "3bd6de24-3b32-4014-828d-44abd752467b"). InnerVolumeSpecName "kube-api-access-bkw99". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:11:04.493033 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.493012 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b187f500-8a83-41c6-8672-8216c305049d-kube-api-access-ls2mm" (OuterVolumeSpecName: "kube-api-access-ls2mm") pod "b187f500-8a83-41c6-8672-8216c305049d" (UID: "b187f500-8a83-41c6-8672-8216c305049d"). InnerVolumeSpecName "kube-api-access-ls2mm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:11:04.493800 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.493776 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb431080-b815-441e-ae40-a7cbb06c00e8-kube-api-access-qvscb" (OuterVolumeSpecName: "kube-api-access-qvscb") pod "cb431080-b815-441e-ae40-a7cbb06c00e8" (UID: "cb431080-b815-441e-ae40-a7cbb06c00e8"). InnerVolumeSpecName "kube-api-access-qvscb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:11:04.496723 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.496698 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-util" (OuterVolumeSpecName: "util") pod "cb431080-b815-441e-ae40-a7cbb06c00e8" (UID: "cb431080-b815-441e-ae40-a7cbb06c00e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:04.497196 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.497154 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-util" (OuterVolumeSpecName: "util") pod "b187f500-8a83-41c6-8672-8216c305049d" (UID: "b187f500-8a83-41c6-8672-8216c305049d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:04.497245 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.497201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-util" (OuterVolumeSpecName: "util") pod "3bd6de24-3b32-4014-828d-44abd752467b" (UID: "3bd6de24-3b32-4014-828d-44abd752467b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:04.592017 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.591976 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ls2mm\" (UniqueName: \"kubernetes.io/projected/b187f500-8a83-41c6-8672-8216c305049d-kube-api-access-ls2mm\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:04.592017 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.592010 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:04.592017 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.592020 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkw99\" (UniqueName: \"kubernetes.io/projected/3bd6de24-3b32-4014-828d-44abd752467b-kube-api-access-bkw99\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:04.592260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.592031 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvscb\" (UniqueName: \"kubernetes.io/projected/cb431080-b815-441e-ae40-a7cbb06c00e8-kube-api-access-qvscb\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:04.592260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.592041 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:04.592260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.592049 2576 reconciler_common.go:299] "Volume detached for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/cb431080-b815-441e-ae40-a7cbb06c00e8-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:04.592260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.592057 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b187f500-8a83-41c6-8672-8216c305049d-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:04.592260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.592064 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:04.592260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:04.592072 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd6de24-3b32-4014-828d-44abd752467b-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:05.252090 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.252062 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" Apr 21 16:11:05.252090 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.252079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9" event={"ID":"3bd6de24-3b32-4014-828d-44abd752467b","Type":"ContainerDied","Data":"4adaabba6effade327070fdaee4a52bce82658d60811487a3586152c84354d3c"} Apr 21 16:11:05.252562 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.252108 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4adaabba6effade327070fdaee4a52bce82658d60811487a3586152c84354d3c" Apr 21 16:11:05.253936 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.253913 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" Apr 21 16:11:05.254057 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.253912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p" event={"ID":"cb431080-b815-441e-ae40-a7cbb06c00e8","Type":"ContainerDied","Data":"c107ac495f85bcb371369d632ee2ad23aedf15aa8ee03f81a7266a8c0733f750"} Apr 21 16:11:05.254057 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.254014 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c107ac495f85bcb371369d632ee2ad23aedf15aa8ee03f81a7266a8c0733f750" Apr 21 16:11:05.255653 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.255629 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" Apr 21 16:11:05.255653 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.255650 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5" event={"ID":"b187f500-8a83-41c6-8672-8216c305049d","Type":"ContainerDied","Data":"58a5b35285f72d2d97f77e31de019474ad1541a35330717ca3d21193b5c3b121"} Apr 21 16:11:05.255828 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.255671 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58a5b35285f72d2d97f77e31de019474ad1541a35330717ca3d21193b5c3b121" Apr 21 16:11:05.363276 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.363256 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:05.501089 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.501060 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-util\") pod \"514becdf-1310-4620-9c42-c349a3b1359a\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " Apr 21 16:11:05.501089 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.501091 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km7xr\" (UniqueName: \"kubernetes.io/projected/514becdf-1310-4620-9c42-c349a3b1359a-kube-api-access-km7xr\") pod \"514becdf-1310-4620-9c42-c349a3b1359a\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " Apr 21 16:11:05.501278 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.501136 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-bundle\") pod \"514becdf-1310-4620-9c42-c349a3b1359a\" (UID: \"514becdf-1310-4620-9c42-c349a3b1359a\") " Apr 21 16:11:05.501682 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.501653 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-bundle" (OuterVolumeSpecName: "bundle") pod "514becdf-1310-4620-9c42-c349a3b1359a" (UID: "514becdf-1310-4620-9c42-c349a3b1359a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:05.503256 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.503206 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514becdf-1310-4620-9c42-c349a3b1359a-kube-api-access-km7xr" (OuterVolumeSpecName: "kube-api-access-km7xr") pod "514becdf-1310-4620-9c42-c349a3b1359a" (UID: "514becdf-1310-4620-9c42-c349a3b1359a"). InnerVolumeSpecName "kube-api-access-km7xr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:11:05.508949 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.508918 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-util" (OuterVolumeSpecName: "util") pod "514becdf-1310-4620-9c42-c349a3b1359a" (UID: "514becdf-1310-4620-9c42-c349a3b1359a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:05.602519 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.602493 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:05.602519 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.602517 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-km7xr\" (UniqueName: \"kubernetes.io/projected/514becdf-1310-4620-9c42-c349a3b1359a-kube-api-access-km7xr\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:05.602648 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:05.602533 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/514becdf-1310-4620-9c42-c349a3b1359a-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:06.261068 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.261039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" event={"ID":"514becdf-1310-4620-9c42-c349a3b1359a","Type":"ContainerDied","Data":"2cb1700732452f607fcd66696c92efdb6c23f4f8298c456537573a4717ab8ddc"} Apr 21 16:11:06.261068 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.261070 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cb1700732452f607fcd66696c92efdb6c23f4f8298c456537573a4717ab8ddc" Apr 21 16:11:06.261417 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.261112 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9" Apr 21 16:11:06.933639 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.933605 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b95999c84-9vhn4"] Apr 21 16:11:06.933960 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.933947 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bd6de24-3b32-4014-828d-44abd752467b" containerName="pull" Apr 21 16:11:06.934012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.933961 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd6de24-3b32-4014-828d-44abd752467b" containerName="pull" Apr 21 16:11:06.934012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.933969 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerName="extract" Apr 21 16:11:06.934012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.933974 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerName="extract" Apr 21 16:11:06.934012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.933984 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerName="pull" Apr 21 16:11:06.934012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.933989 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerName="pull" Apr 21 16:11:06.934012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.933996 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bd6de24-3b32-4014-828d-44abd752467b" containerName="extract" Apr 21 16:11:06.934012 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934002 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd6de24-3b32-4014-828d-44abd752467b" containerName="extract" Apr 21 16:11:06.934012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934010 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="514becdf-1310-4620-9c42-c349a3b1359a" containerName="extract" Apr 21 16:11:06.934012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934015 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="514becdf-1310-4620-9c42-c349a3b1359a" containerName="extract" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934026 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b187f500-8a83-41c6-8672-8216c305049d" containerName="util" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934030 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b187f500-8a83-41c6-8672-8216c305049d" containerName="util" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934037 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerName="util" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934042 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerName="util" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934047 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b187f500-8a83-41c6-8672-8216c305049d" containerName="extract" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934052 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b187f500-8a83-41c6-8672-8216c305049d" containerName="extract" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934058 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="514becdf-1310-4620-9c42-c349a3b1359a" containerName="pull" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934063 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="514becdf-1310-4620-9c42-c349a3b1359a" containerName="pull" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934070 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bd6de24-3b32-4014-828d-44abd752467b" containerName="util" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934075 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd6de24-3b32-4014-828d-44abd752467b" containerName="util" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934081 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="514becdf-1310-4620-9c42-c349a3b1359a" containerName="util" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934086 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="514becdf-1310-4620-9c42-c349a3b1359a" containerName="util" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934091 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b187f500-8a83-41c6-8672-8216c305049d" containerName="pull" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934095 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b187f500-8a83-41c6-8672-8216c305049d" containerName="pull" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934154 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b187f500-8a83-41c6-8672-8216c305049d" containerName="extract" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934162 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb431080-b815-441e-ae40-a7cbb06c00e8" containerName="extract" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934168 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bd6de24-3b32-4014-828d-44abd752467b" containerName="extract" Apr 21 16:11:06.934269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.934176 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="514becdf-1310-4620-9c42-c349a3b1359a" containerName="extract" Apr 21 16:11:06.937752 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.937733 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:06.947523 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:06.947497 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b95999c84-9vhn4"] Apr 21 16:11:07.012743 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.012723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-oauth-config\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.012851 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.012753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-config\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.012851 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.012816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-serving-cert\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.012938 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.012853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-service-ca\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.012938 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.012922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-oauth-serving-cert\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.013000 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.012945 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-trusted-ca-bundle\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.013000 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.012961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csw4l\" (UniqueName: \"kubernetes.io/projected/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-kube-api-access-csw4l\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.113614 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.113593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-service-ca\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.113707 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.113637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-oauth-serving-cert\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.113707 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.113666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-trusted-ca-bundle\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.113877 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.113837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csw4l\" (UniqueName: \"kubernetes.io/projected/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-kube-api-access-csw4l\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.113928 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.113902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-oauth-config\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.113967 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.113934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-config\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.114018 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.113991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-serving-cert\") 
pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.114364 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.114331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-service-ca\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.114455 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.114390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-oauth-serving-cert\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.114528 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.114515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-trusted-ca-bundle\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.114610 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.114591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-config\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.116291 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.116267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-oauth-config\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.116532 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.116513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-console-serving-cert\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.122131 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.122109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csw4l\" (UniqueName: \"kubernetes.io/projected/20353ddb-2c96-48a6-a3df-7d1daa3ad80f-kube-api-access-csw4l\") pod \"console-7b95999c84-9vhn4\" (UID: \"20353ddb-2c96-48a6-a3df-7d1daa3ad80f\") " pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.248647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.248583 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:07.370748 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:07.370722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b95999c84-9vhn4"] Apr 21 16:11:07.373215 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:11:07.373184 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20353ddb_2c96_48a6_a3df_7d1daa3ad80f.slice/crio-2d7cac34a165084222dff18efb0885beefd0b20f1b4e2f3f035e286d446e69e6 WatchSource:0}: Error finding container 2d7cac34a165084222dff18efb0885beefd0b20f1b4e2f3f035e286d446e69e6: Status 404 returned error can't find the container with id 2d7cac34a165084222dff18efb0885beefd0b20f1b4e2f3f035e286d446e69e6 Apr 21 16:11:08.271129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:08.271093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b95999c84-9vhn4" event={"ID":"20353ddb-2c96-48a6-a3df-7d1daa3ad80f","Type":"ContainerStarted","Data":"4d50745778041255cd450355ff0e40c926074b2ec59d75169c8871a2099fbe0d"} Apr 21 16:11:08.271129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:08.271129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b95999c84-9vhn4" event={"ID":"20353ddb-2c96-48a6-a3df-7d1daa3ad80f","Type":"ContainerStarted","Data":"2d7cac34a165084222dff18efb0885beefd0b20f1b4e2f3f035e286d446e69e6"} Apr 21 16:11:08.290411 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:08.290359 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b95999c84-9vhn4" podStartSLOduration=2.290344728 podStartE2EDuration="2.290344728s" podCreationTimestamp="2026-04-21 16:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:11:08.287461441 +0000 UTC m=+556.663643254" watchObservedRunningTime="2026-04-21 16:11:08.290344728 +0000 UTC m=+556.666526538" Apr 21 16:11:15.283989 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.283953 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c"] Apr 21 16:11:15.286292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.286275 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" Apr 21 16:11:15.290511 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.290485 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-46flm\"" Apr 21 16:11:15.290616 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.290600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 16:11:15.333665 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.333622 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c"] Apr 21 16:11:15.382141 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.382118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl6rt\" (UniqueName: \"kubernetes.io/projected/76de666e-091c-4605-a1b1-836dd44b381b-kube-api-access-jl6rt\") pod \"dns-operator-controller-manager-648d5c98bc-krh7c\" (UID: \"76de666e-091c-4605-a1b1-836dd44b381b\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" Apr 21 16:11:15.482508 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.482485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl6rt\" (UniqueName: \"kubernetes.io/projected/76de666e-091c-4605-a1b1-836dd44b381b-kube-api-access-jl6rt\") pod \"dns-operator-controller-manager-648d5c98bc-krh7c\" (UID: \"76de666e-091c-4605-a1b1-836dd44b381b\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" Apr 21 16:11:15.505502 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.505480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl6rt\" (UniqueName: \"kubernetes.io/projected/76de666e-091c-4605-a1b1-836dd44b381b-kube-api-access-jl6rt\") pod \"dns-operator-controller-manager-648d5c98bc-krh7c\" (UID: \"76de666e-091c-4605-a1b1-836dd44b381b\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" Apr 21 16:11:15.596865 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.596803 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" Apr 21 16:11:15.720145 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:15.720115 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c"] Apr 21 16:11:15.721238 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:11:15.721206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76de666e_091c_4605_a1b1_836dd44b381b.slice/crio-453502de2b7a06ddb53574f13f17f95c142bbf6f35498add89930a971eabd0f3 WatchSource:0}: Error finding container 453502de2b7a06ddb53574f13f17f95c142bbf6f35498add89930a971eabd0f3: Status 404 returned error can't find the container with id 453502de2b7a06ddb53574f13f17f95c142bbf6f35498add89930a971eabd0f3 Apr 21 16:11:16.309601 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:16.309567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" event={"ID":"76de666e-091c-4605-a1b1-836dd44b381b","Type":"ContainerStarted","Data":"453502de2b7a06ddb53574f13f17f95c142bbf6f35498add89930a971eabd0f3"} Apr 21 16:11:17.249292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:17.249251 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:17.249292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:17.249299 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:17.254075 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:17.254051 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:17.324903 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:17.321989 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b95999c84-9vhn4" Apr 21 16:11:17.386780 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:17.386748 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-579bfd59c8-j698z"] Apr 21 16:11:18.764736 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:18.764697 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd"] Apr 21 16:11:18.767112 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:18.767096 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:18.769670 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:18.769648 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-4v59m\"" Apr 21 16:11:18.777408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:18.777387 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd"] Apr 21 16:11:18.913770 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:18.913735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrpt9\" (UniqueName: \"kubernetes.io/projected/03f01d4d-6722-45bc-9464-77d7cd1acf98-kube-api-access-xrpt9\") pod \"limitador-operator-controller-manager-85c4996f8c-7rddd\" (UID: \"03f01d4d-6722-45bc-9464-77d7cd1acf98\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:19.015157 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:19.015123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrpt9\" (UniqueName: \"kubernetes.io/projected/03f01d4d-6722-45bc-9464-77d7cd1acf98-kube-api-access-xrpt9\") pod \"limitador-operator-controller-manager-85c4996f8c-7rddd\" (UID: \"03f01d4d-6722-45bc-9464-77d7cd1acf98\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:19.026083 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:19.026058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrpt9\" (UniqueName: \"kubernetes.io/projected/03f01d4d-6722-45bc-9464-77d7cd1acf98-kube-api-access-xrpt9\") pod \"limitador-operator-controller-manager-85c4996f8c-7rddd\" (UID: \"03f01d4d-6722-45bc-9464-77d7cd1acf98\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:19.077751 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:19.077725 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:19.214931 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:19.214833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd"] Apr 21 16:11:19.218547 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:11:19.218518 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03f01d4d_6722_45bc_9464_77d7cd1acf98.slice/crio-a4cbcf248281dae1ab997879e6039c4a775e66efacfd36773126175f041fabf1 WatchSource:0}: Error finding container a4cbcf248281dae1ab997879e6039c4a775e66efacfd36773126175f041fabf1: Status 404 returned error can't find the container with id a4cbcf248281dae1ab997879e6039c4a775e66efacfd36773126175f041fabf1 Apr 21 16:11:19.330558 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:19.330518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" event={"ID":"76de666e-091c-4605-a1b1-836dd44b381b","Type":"ContainerStarted","Data":"0875b96ff52df4c9aefdfcce36810689a9b81609df940f318ddad6602fbe679b"} Apr 21 16:11:19.330701 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:19.330661 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" Apr 21 16:11:19.331662 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:19.331642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" event={"ID":"03f01d4d-6722-45bc-9464-77d7cd1acf98","Type":"ContainerStarted","Data":"a4cbcf248281dae1ab997879e6039c4a775e66efacfd36773126175f041fabf1"} Apr 21 16:11:19.355571 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:19.355524 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" podStartSLOduration=1.525877805 podStartE2EDuration="4.355509678s" podCreationTimestamp="2026-04-21 16:11:15 +0000 UTC" firstStartedPulling="2026-04-21 16:11:15.723155611 +0000 UTC m=+564.099337401" lastFinishedPulling="2026-04-21 16:11:18.552787475 +0000 UTC m=+566.928969274" observedRunningTime="2026-04-21 16:11:19.353373393 +0000 UTC m=+567.729555204" watchObservedRunningTime="2026-04-21 16:11:19.355509678 +0000 UTC m=+567.731691489" Apr 21 16:11:21.340821 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:21.340790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" event={"ID":"03f01d4d-6722-45bc-9464-77d7cd1acf98","Type":"ContainerStarted","Data":"e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b"} Apr 21 16:11:21.341124 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:21.340839 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:21.359303 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:21.359260 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" podStartSLOduration=1.313434256 podStartE2EDuration="3.359247566s" podCreationTimestamp="2026-04-21 16:11:18 +0000 UTC" firstStartedPulling="2026-04-21 16:11:19.220624726 +0000 UTC m=+567.596806514" 
lastFinishedPulling="2026-04-21 16:11:21.266438031 +0000 UTC m=+569.642619824" observedRunningTime="2026-04-21 16:11:21.356872045 +0000 UTC m=+569.733053846" watchObservedRunningTime="2026-04-21 16:11:21.359247566 +0000 UTC m=+569.735429378" Apr 21 16:11:27.636120 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.636090 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24"] Apr 21 16:11:27.638919 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.638903 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:27.642119 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.642103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-x4qcz\"" Apr 21 16:11:27.659417 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.659383 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24"] Apr 21 16:11:27.786499 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.786472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skf2g\" (UniqueName: \"kubernetes.io/projected/d4643c17-49d5-40df-8df0-91aa3933c7da-kube-api-access-skf2g\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" (UID: \"d4643c17-49d5-40df-8df0-91aa3933c7da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:27.786616 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.786535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4643c17-49d5-40df-8df0-91aa3933c7da-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" (UID: \"d4643c17-49d5-40df-8df0-91aa3933c7da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:27.887755 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.887673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skf2g\" (UniqueName: \"kubernetes.io/projected/d4643c17-49d5-40df-8df0-91aa3933c7da-kube-api-access-skf2g\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" (UID: \"d4643c17-49d5-40df-8df0-91aa3933c7da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:27.887755 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.887754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4643c17-49d5-40df-8df0-91aa3933c7da-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" (UID: \"d4643c17-49d5-40df-8df0-91aa3933c7da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:27.888098 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.888080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4643c17-49d5-40df-8df0-91aa3933c7da-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" (UID: \"d4643c17-49d5-40df-8df0-91aa3933c7da\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:27.899319 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.899297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skf2g\" (UniqueName: \"kubernetes.io/projected/d4643c17-49d5-40df-8df0-91aa3933c7da-kube-api-access-skf2g\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" (UID: \"d4643c17-49d5-40df-8df0-91aa3933c7da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:27.949454 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:27.949428 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:28.098060 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:11:28.098018 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4643c17_49d5_40df_8df0_91aa3933c7da.slice/crio-60e947529af7b1b6b7c0f584b1bcf679d017dbc0ecf861cd74511a3c0908a2fc WatchSource:0}: Error finding container 60e947529af7b1b6b7c0f584b1bcf679d017dbc0ecf861cd74511a3c0908a2fc: Status 404 returned error can't find the container with id 60e947529af7b1b6b7c0f584b1bcf679d017dbc0ecf861cd74511a3c0908a2fc Apr 21 16:11:28.099955 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:28.099930 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24"] Apr 21 16:11:28.371550 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:28.371518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" event={"ID":"d4643c17-49d5-40df-8df0-91aa3933c7da","Type":"ContainerStarted","Data":"60e947529af7b1b6b7c0f584b1bcf679d017dbc0ecf861cd74511a3c0908a2fc"} Apr 21 16:11:30.337921 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:30.337891 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-krh7c" Apr 21 16:11:32.347112 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:32.347080 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:35.404012 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:35.403975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" event={"ID":"d4643c17-49d5-40df-8df0-91aa3933c7da","Type":"ContainerStarted","Data":"1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05"} Apr 21 16:11:35.404383 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:35.404052 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:35.431310 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:35.431264 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" podStartSLOduration=1.462727805 podStartE2EDuration="8.431251898s" podCreationTimestamp="2026-04-21 16:11:27 +0000 UTC" firstStartedPulling="2026-04-21 16:11:28.100833474 +0000 UTC m=+576.477015267" lastFinishedPulling="2026-04-21 16:11:35.069357567 +0000 UTC m=+583.445539360" observedRunningTime="2026-04-21 16:11:35.42835475 +0000 UTC 
m=+583.804536572" watchObservedRunningTime="2026-04-21 16:11:35.431251898 +0000 UTC m=+583.807433710" Apr 21 16:11:42.409513 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.409471 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-579bfd59c8-j698z" podUID="e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" containerName="console" containerID="cri-o://74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14" gracePeriod=15 Apr 21 16:11:42.648660 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.648640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-579bfd59c8-j698z_e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a/console/0.log" Apr 21 16:11:42.648767 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.648696 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:11:42.707260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707196 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frtnh\" (UniqueName: \"kubernetes.io/projected/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-kube-api-access-frtnh\") pod \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " Apr 21 16:11:42.707260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707231 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-oauth-serving-cert\") pod \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " Apr 21 16:11:42.707260 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707259 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-oauth-config\") pod \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " Apr 21 16:11:42.707482 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707283 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-trusted-ca-bundle\") pod \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " Apr 21 16:11:42.707482 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707302 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-config\") pod \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " Apr 21 16:11:42.707482 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707327 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-service-ca\") pod \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\" (UID: \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " Apr 21 16:11:42.707482 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707342 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-serving-cert\") pod \"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\" (UID: 
\"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a\") " Apr 21 16:11:42.707688 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707562 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" (UID: "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:11:42.707841 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707809 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-service-ca" (OuterVolumeSpecName: "service-ca") pod "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" (UID: "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:11:42.707985 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707876 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" (UID: "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:11:42.707985 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.707946 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-config" (OuterVolumeSpecName: "console-config") pod "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" (UID: "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:11:42.709505 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.709475 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" (UID: "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:11:42.709628 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.709607 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" (UID: "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:11:42.709676 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.709639 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-kube-api-access-frtnh" (OuterVolumeSpecName: "kube-api-access-frtnh") pod "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" (UID: "e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a"). InnerVolumeSpecName "kube-api-access-frtnh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:11:42.808847 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.808821 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frtnh\" (UniqueName: \"kubernetes.io/projected/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-kube-api-access-frtnh\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:42.808847 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.808846 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-oauth-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:42.809011 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.808884 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-oauth-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:42.809011 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.808898 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-trusted-ca-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:42.809011 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.808911 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:42.809011 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.808924 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-service-ca\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:42.809011 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:42.808935 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a-console-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:43.441477 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.441448 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-579bfd59c8-j698z_e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a/console/0.log" Apr 21 16:11:43.441979 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.441488 2576 generic.go:358] "Generic (PLEG): container finished" podID="e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" containerID="74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14" exitCode=2 Apr 21 16:11:43.441979 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.441531 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579bfd59c8-j698z" event={"ID":"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a","Type":"ContainerDied","Data":"74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14"} Apr 21 16:11:43.441979 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.441579 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-579bfd59c8-j698z" Apr 21 16:11:43.441979 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.441598 2576 scope.go:117] "RemoveContainer" containerID="74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14" Apr 21 16:11:43.441979 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.441584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579bfd59c8-j698z" event={"ID":"e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a","Type":"ContainerDied","Data":"5afa2fea425a73130c040c72c2b49b28c5e3a6e6b40f6f2c0ae5d80ff3d8a248"} Apr 21 16:11:43.450996 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.450983 2576 scope.go:117] "RemoveContainer" containerID="74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14" Apr 21 16:11:43.451240 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:11:43.451223 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14\": container with ID starting with 74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14 not found: ID does not exist" containerID="74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14" Apr 21 16:11:43.451293 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.451246 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14"} err="failed to get container status \"74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14\": rpc error: code = NotFound desc = could not find container \"74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14\": container with ID starting with 74ee32a2028bf32f9c739ca4d9178b91c2b5b2cce82ea6b8aab712377827db14 not found: ID does not exist" Apr 21 16:11:43.468960 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.468939 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-579bfd59c8-j698z"] Apr 21 16:11:43.475946 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:43.475924 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-579bfd59c8-j698z"] Apr 21 16:11:44.252835 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:44.252799 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" path="/var/lib/kubelet/pods/e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a/volumes" Apr 21 16:11:46.416279 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:46.416243 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:48.024003 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.023971 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24"] Apr 21 16:11:48.024440 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.024190 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" podUID="d4643c17-49d5-40df-8df0-91aa3933c7da" containerName="manager" containerID="cri-o://1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05" gracePeriod=2 Apr 21 16:11:48.030835 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.030805 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24"] Apr 21 16:11:48.062540 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.062513 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd"] Apr 21 16:11:48.063076 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.062885 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" podUID="03f01d4d-6722-45bc-9464-77d7cd1acf98" containerName="manager" containerID="cri-o://e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b" gracePeriod=2 Apr 21 16:11:48.065518 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.065485 2576 status_manager.go:895] "Failed to get status for pod" podUID="d4643c17-49d5-40df-8df0-91aa3933c7da" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" is forbidden: User \"system:node:ip-10-0-138-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-191.ec2.internal' and this object" Apr 21 16:11:48.065779 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.065751 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd"] Apr 21 16:11:48.068792 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.068760 2576 status_manager.go:895] "Failed to get status for pod" podUID="03f01d4d-6722-45bc-9464-77d7cd1acf98" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" err="pods \"limitador-operator-controller-manager-85c4996f8c-7rddd\" is forbidden: User \"system:node:ip-10-0-138-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-191.ec2.internal' and this object" Apr 21 16:11:48.079233 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079210 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn"] Apr 21 16:11:48.079754 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079740 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" containerName="console" Apr 21 16:11:48.079800 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079758 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" containerName="console" Apr 21 16:11:48.079800 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079795 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03f01d4d-6722-45bc-9464-77d7cd1acf98" containerName="manager" Apr 21 16:11:48.079872 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079804 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f01d4d-6722-45bc-9464-77d7cd1acf98" containerName="manager" Apr 21 16:11:48.079872 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079816 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4643c17-49d5-40df-8df0-91aa3933c7da" containerName="manager" Apr 21 16:11:48.079872 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079824 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4643c17-49d5-40df-8df0-91aa3933c7da" containerName="manager" 
Apr 21 16:11:48.079962 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079926 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e989038b-2f0d-4dc9-b6d3-fd56e2c32c7a" containerName="console" Apr 21 16:11:48.079962 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079942 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4643c17-49d5-40df-8df0-91aa3933c7da" containerName="manager" Apr 21 16:11:48.079962 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.079954 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="03f01d4d-6722-45bc-9464-77d7cd1acf98" containerName="manager" Apr 21 16:11:48.082572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.082552 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:48.101235 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.100551 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7"] Apr 21 16:11:48.103541 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.103523 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn"] Apr 21 16:11:48.103652 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.103638 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" Apr 21 16:11:48.126496 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.126470 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7"] Apr 21 16:11:48.144419 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.144388 2576 status_manager.go:895] "Failed to get status for pod" podUID="d4643c17-49d5-40df-8df0-91aa3933c7da" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" is forbidden: User \"system:node:ip-10-0-138-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-191.ec2.internal' and this object" Apr 21 16:11:48.146506 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.146478 2576 status_manager.go:895] "Failed to get status for pod" podUID="03f01d4d-6722-45bc-9464-77d7cd1acf98" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" err="pods \"limitador-operator-controller-manager-85c4996f8c-7rddd\" is forbidden: User \"system:node:ip-10-0-138-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-191.ec2.internal' and this object" Apr 21 16:11:48.148693 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.148668 2576 status_manager.go:895] "Failed to get status for pod" podUID="d4643c17-49d5-40df-8df0-91aa3933c7da" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8lj24\" is forbidden: User \"system:node:ip-10-0-138-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-191.ec2.internal' and this object" Apr 21 16:11:48.148800 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.148722 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxbgl\" (UniqueName: \"kubernetes.io/projected/7c5e7d02-3e1c-4afb-921c-06c00262a694-kube-api-access-qxbgl\") pod \"limitador-operator-controller-manager-85c4996f8c-ldfw7\" (UID: \"7c5e7d02-3e1c-4afb-921c-06c00262a694\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" Apr 21 16:11:48.148800 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.148761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4c07586-5321-4708-b908-e7150e5bb613-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ljpbn\" (UID: \"e4c07586-5321-4708-b908-e7150e5bb613\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:48.148943 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.148838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4r72\" (UniqueName: \"kubernetes.io/projected/e4c07586-5321-4708-b908-e7150e5bb613-kube-api-access-w4r72\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ljpbn\" (UID: \"e4c07586-5321-4708-b908-e7150e5bb613\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:48.176362 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.176328 2576 status_manager.go:895] "Failed to get status for pod" podUID="03f01d4d-6722-45bc-9464-77d7cd1acf98" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" err="pods \"limitador-operator-controller-manager-85c4996f8c-7rddd\" is forbidden: User \"system:node:ip-10-0-138-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-191.ec2.internal' and this object" Apr 21 16:11:48.249464 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.249423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxbgl\" (UniqueName: \"kubernetes.io/projected/7c5e7d02-3e1c-4afb-921c-06c00262a694-kube-api-access-qxbgl\") pod \"limitador-operator-controller-manager-85c4996f8c-ldfw7\" (UID: \"7c5e7d02-3e1c-4afb-921c-06c00262a694\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" Apr 21 16:11:48.249600 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.249536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4c07586-5321-4708-b908-e7150e5bb613-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ljpbn\" (UID: \"e4c07586-5321-4708-b908-e7150e5bb613\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:48.249661 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.249605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4r72\" (UniqueName: \"kubernetes.io/projected/e4c07586-5321-4708-b908-e7150e5bb613-kube-api-access-w4r72\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ljpbn\" (UID: \"e4c07586-5321-4708-b908-e7150e5bb613\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:48.250003 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.249978 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4c07586-5321-4708-b908-e7150e5bb613-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ljpbn\" (UID: \"e4c07586-5321-4708-b908-e7150e5bb613\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:48.260848 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.260821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4r72\" (UniqueName: \"kubernetes.io/projected/e4c07586-5321-4708-b908-e7150e5bb613-kube-api-access-w4r72\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ljpbn\" (UID: \"e4c07586-5321-4708-b908-e7150e5bb613\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:48.261009 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.260986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxbgl\" (UniqueName: \"kubernetes.io/projected/7c5e7d02-3e1c-4afb-921c-06c00262a694-kube-api-access-qxbgl\") pod \"limitador-operator-controller-manager-85c4996f8c-ldfw7\" (UID: \"7c5e7d02-3e1c-4afb-921c-06c00262a694\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" Apr 21 16:11:48.292371 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.292349 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:48.295416 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.295399 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:48.350788 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.350766 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrpt9\" (UniqueName: \"kubernetes.io/projected/03f01d4d-6722-45bc-9464-77d7cd1acf98-kube-api-access-xrpt9\") pod \"03f01d4d-6722-45bc-9464-77d7cd1acf98\" (UID: \"03f01d4d-6722-45bc-9464-77d7cd1acf98\") " Apr 21 16:11:48.350903 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.350851 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skf2g\" (UniqueName: \"kubernetes.io/projected/d4643c17-49d5-40df-8df0-91aa3933c7da-kube-api-access-skf2g\") pod \"d4643c17-49d5-40df-8df0-91aa3933c7da\" (UID: \"d4643c17-49d5-40df-8df0-91aa3933c7da\") " Apr 21 16:11:48.350981 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.350966 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4643c17-49d5-40df-8df0-91aa3933c7da-extensions-socket-volume\") pod \"d4643c17-49d5-40df-8df0-91aa3933c7da\" (UID: \"d4643c17-49d5-40df-8df0-91aa3933c7da\") " Apr 21 16:11:48.351519 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.351491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4643c17-49d5-40df-8df0-91aa3933c7da-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "d4643c17-49d5-40df-8df0-91aa3933c7da" (UID: "d4643c17-49d5-40df-8df0-91aa3933c7da"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:11:48.352776 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.352756 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f01d4d-6722-45bc-9464-77d7cd1acf98-kube-api-access-xrpt9" (OuterVolumeSpecName: "kube-api-access-xrpt9") pod "03f01d4d-6722-45bc-9464-77d7cd1acf98" (UID: "03f01d4d-6722-45bc-9464-77d7cd1acf98"). InnerVolumeSpecName "kube-api-access-xrpt9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:11:48.352850 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.352818 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4643c17-49d5-40df-8df0-91aa3933c7da-kube-api-access-skf2g" (OuterVolumeSpecName: "kube-api-access-skf2g") pod "d4643c17-49d5-40df-8df0-91aa3933c7da" (UID: "d4643c17-49d5-40df-8df0-91aa3933c7da"). InnerVolumeSpecName "kube-api-access-skf2g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:11:48.443414 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.443390 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:48.450255 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.450231 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" Apr 21 16:11:48.451998 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.451979 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4643c17-49d5-40df-8df0-91aa3933c7da-extensions-socket-volume\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:48.452092 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.452001 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xrpt9\" (UniqueName: \"kubernetes.io/projected/03f01d4d-6722-45bc-9464-77d7cd1acf98-kube-api-access-xrpt9\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:48.452092 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.452015 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skf2g\" (UniqueName: \"kubernetes.io/projected/d4643c17-49d5-40df-8df0-91aa3933c7da-kube-api-access-skf2g\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:11:48.464426 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.464401 2576 generic.go:358] "Generic (PLEG): container finished" podID="03f01d4d-6722-45bc-9464-77d7cd1acf98" containerID="e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b" exitCode=0 Apr 21 16:11:48.464520 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.464460 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7rddd" Apr 21 16:11:48.464520 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.464512 2576 scope.go:117] "RemoveContainer" containerID="e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b" Apr 21 16:11:48.465999 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.465963 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4643c17-49d5-40df-8df0-91aa3933c7da" containerID="1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05" exitCode=0 Apr 21 16:11:48.466136 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.466030 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8lj24" Apr 21 16:11:48.474537 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.474518 2576 scope.go:117] "RemoveContainer" containerID="e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b" Apr 21 16:11:48.474837 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:11:48.474813 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b\": container with ID starting with e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b not found: ID does not exist" containerID="e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b" Apr 21 16:11:48.474927 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.474850 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b"} err="failed to get container status \"e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b\": rpc error: code = NotFound desc = could not find container \"e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b\": container with ID starting with e5a4ce0a5a1b92a939b7b523b2d973581e724f450c757382af9b480bd552c12b not found: ID does not exist" Apr 21 16:11:48.474927 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.474892 2576 scope.go:117] "RemoveContainer" containerID="1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05" Apr 21 16:11:48.485750 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.485713 2576 scope.go:117] "RemoveContainer" containerID="1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05" Apr 21 16:11:48.486015 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:11:48.485995 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05\": container with ID starting with 1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05 not found: ID does not exist" containerID="1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05" Apr 21 16:11:48.486086 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.486021 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05"} err="failed to get container status \"1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05\": rpc error: code = NotFound desc = could not find container \"1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05\": container with ID starting with 
1f24689fb06a509d07e1ee64575341f655c889dd2b454f0f279dc743fc2b3e05 not found: ID does not exist" Apr 21 16:11:48.592870 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.592825 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn"] Apr 21 16:11:48.596771 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:11:48.596740 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4c07586_5321_4708_b908_e7150e5bb613.slice/crio-18827c192df4a218d9ae0137b6e083906eea272cfa6ae9ff2f6065a9c3ea6377 WatchSource:0}: Error finding container 18827c192df4a218d9ae0137b6e083906eea272cfa6ae9ff2f6065a9c3ea6377: Status 404 returned error can't find the container with id 18827c192df4a218d9ae0137b6e083906eea272cfa6ae9ff2f6065a9c3ea6377 Apr 21 16:11:48.613096 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:48.613072 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7"] Apr 21 16:11:48.616832 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:11:48.616796 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5e7d02_3e1c_4afb_921c_06c00262a694.slice/crio-7be051704f232eebce0e7c805ffea142ee48a1c08e5c9c8f98fdbd78ce67faac WatchSource:0}: Error finding container 7be051704f232eebce0e7c805ffea142ee48a1c08e5c9c8f98fdbd78ce67faac: Status 404 returned error can't find the container with id 7be051704f232eebce0e7c805ffea142ee48a1c08e5c9c8f98fdbd78ce67faac Apr 21 16:11:49.471712 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:49.471632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" event={"ID":"e4c07586-5321-4708-b908-e7150e5bb613","Type":"ContainerStarted","Data":"0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a"} Apr 21 16:11:49.472196 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:49.471818 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:11:49.472196 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:49.471838 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" event={"ID":"e4c07586-5321-4708-b908-e7150e5bb613","Type":"ContainerStarted","Data":"18827c192df4a218d9ae0137b6e083906eea272cfa6ae9ff2f6065a9c3ea6377"} Apr 21 16:11:49.474751 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:49.474730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" event={"ID":"7c5e7d02-3e1c-4afb-921c-06c00262a694","Type":"ContainerStarted","Data":"546cfc51dac150c09b0995cebb1efef8f4538f9ca4047a600cfcd52495edf2e1"} Apr 21 16:11:49.474814 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:49.474756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" event={"ID":"7c5e7d02-3e1c-4afb-921c-06c00262a694","Type":"ContainerStarted","Data":"7be051704f232eebce0e7c805ffea142ee48a1c08e5c9c8f98fdbd78ce67faac"} Apr 21 16:11:49.474875 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:49.474850 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" Apr 21 16:11:49.503336 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:49.503290 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" podStartSLOduration=1.50327729 podStartE2EDuration="1.50327729s" podCreationTimestamp="2026-04-21 16:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:11:49.500835977 +0000 UTC m=+597.877017787" watchObservedRunningTime="2026-04-21 16:11:49.50327729 +0000 UTC m=+597.879459101" Apr 21 16:11:49.526920 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:49.526844 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" podStartSLOduration=1.526827929 podStartE2EDuration="1.526827929s" podCreationTimestamp="2026-04-21 16:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:11:49.525767821 +0000 UTC m=+597.901949631" watchObservedRunningTime="2026-04-21 16:11:49.526827929 +0000 UTC m=+597.903009741" Apr 21 16:11:50.257752 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:50.257717 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f01d4d-6722-45bc-9464-77d7cd1acf98" path="/var/lib/kubelet/pods/03f01d4d-6722-45bc-9464-77d7cd1acf98/volumes" Apr 21 16:11:50.258105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:50.258091 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4643c17-49d5-40df-8df0-91aa3933c7da" path="/var/lib/kubelet/pods/d4643c17-49d5-40df-8df0-91aa3933c7da/volumes" Apr 21 16:11:52.170503 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:52.170478 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:11:52.170922 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:11:52.170478 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:12:00.481583 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:00.481539 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ldfw7" Apr 21 16:12:00.482083 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:00.481606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:12:04.850300 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:04.850268 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn"] Apr 21 16:12:04.850663 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:04.850477 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" podUID="e4c07586-5321-4708-b908-e7150e5bb613" containerName="manager" containerID="cri-o://0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a" gracePeriod=10 Apr 21 16:12:05.094619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.094597 2576 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:12:05.188721 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.188681 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4c07586-5321-4708-b908-e7150e5bb613-extensions-socket-volume\") pod \"e4c07586-5321-4708-b908-e7150e5bb613\" (UID: \"e4c07586-5321-4708-b908-e7150e5bb613\") " Apr 21 16:12:05.188721 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.188725 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4r72\" (UniqueName: \"kubernetes.io/projected/e4c07586-5321-4708-b908-e7150e5bb613-kube-api-access-w4r72\") pod \"e4c07586-5321-4708-b908-e7150e5bb613\" (UID: \"e4c07586-5321-4708-b908-e7150e5bb613\") " Apr 21 16:12:05.189195 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.189170 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4c07586-5321-4708-b908-e7150e5bb613-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e4c07586-5321-4708-b908-e7150e5bb613" (UID: "e4c07586-5321-4708-b908-e7150e5bb613"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:12:05.191065 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.191032 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c07586-5321-4708-b908-e7150e5bb613-kube-api-access-w4r72" (OuterVolumeSpecName: "kube-api-access-w4r72") pod "e4c07586-5321-4708-b908-e7150e5bb613" (UID: "e4c07586-5321-4708-b908-e7150e5bb613"). InnerVolumeSpecName "kube-api-access-w4r72". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:12:05.290248 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.290216 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4c07586-5321-4708-b908-e7150e5bb613-extensions-socket-volume\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:12:05.290248 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.290242 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4r72\" (UniqueName: \"kubernetes.io/projected/e4c07586-5321-4708-b908-e7150e5bb613-kube-api-access-w4r72\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:12:05.539657 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.539570 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4c07586-5321-4708-b908-e7150e5bb613" containerID="0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a" exitCode=0 Apr 21 16:12:05.539657 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.539635 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" Apr 21 16:12:05.539657 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.539640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" event={"ID":"e4c07586-5321-4708-b908-e7150e5bb613","Type":"ContainerDied","Data":"0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a"} Apr 21 16:12:05.539906 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.539676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn" event={"ID":"e4c07586-5321-4708-b908-e7150e5bb613","Type":"ContainerDied","Data":"18827c192df4a218d9ae0137b6e083906eea272cfa6ae9ff2f6065a9c3ea6377"} Apr 21 16:12:05.539906 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.539691 2576 scope.go:117] "RemoveContainer" containerID="0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a" Apr 21 16:12:05.550542 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.550520 2576 scope.go:117] "RemoveContainer" containerID="0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a" Apr 21 16:12:05.550799 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:12:05.550778 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a\": container with ID starting with 0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a not found: ID does not exist" containerID="0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a" Apr 21 16:12:05.550913 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.550806 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a"} err="failed to get container status \"0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a\": rpc error: code = NotFound desc = could not find container \"0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a\": container with ID starting with 0c90215ac89544153ac52f30b2784b51c33e1318592d850c602f9d0eff1fb02a not found: ID does not exist" Apr 21 16:12:05.565314 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.565292 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn"] Apr 21 16:12:05.571072 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:05.571054 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ljpbn"] Apr 21 16:12:06.252436 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:06.252405 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c07586-5321-4708-b908-e7150e5bb613" path="/var/lib/kubelet/pods/e4c07586-5321-4708-b908-e7150e5bb613/volumes" Apr 21 16:12:21.628562 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.628528 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp"] Apr 21 16:12:21.630907 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.628913 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4c07586-5321-4708-b908-e7150e5bb613" containerName="manager" Apr 21 16:12:21.630907 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.628925 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c07586-5321-4708-b908-e7150e5bb613" containerName="manager" Apr 21 16:12:21.630907 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.628987 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4c07586-5321-4708-b908-e7150e5bb613" containerName="manager" Apr 21 16:12:21.631691 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.631674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.639221 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.639205 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-96j5n\"" Apr 21 16:12:21.647962 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.647935 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp"] Apr 21 16:12:21.723327 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.723456 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.723456 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.723578 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.723578 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wcg\" (UniqueName: \"kubernetes.io/projected/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-kube-api-access-t5wcg\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.723578 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723573 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.723697 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.723697 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.723697 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.723654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.824834 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.824809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.824995 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.824906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.824995 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.824957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.824995 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.824991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istiod-ca-cert\") pod 
\"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825160 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wcg\" (UniqueName: \"kubernetes.io/projected/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-kube-api-access-t5wcg\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825160 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825160 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825160 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825359 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825359 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825359 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825486 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825429 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825486 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.825744 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.825722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.827354 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.827334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.827606 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.827587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.832851 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.832824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wcg\" (UniqueName: \"kubernetes.io/projected/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-kube-api-access-t5wcg\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.833107 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.833087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3a8f8c79-05b9-4e1f-a208-dca73b68e53b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-j9mdp\" (UID: \"3a8f8c79-05b9-4e1f-a208-dca73b68e53b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:21.942087 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:21.942064 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:22.083723 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.083697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp"] Apr 21 16:12:22.083823 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:12:22.083782 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a8f8c79_05b9_4e1f_a208_dca73b68e53b.slice/crio-5ead19065b56f2723f157c731ba9693a5aa884768edf3588f5d1dce38f9b1627 WatchSource:0}: Error finding container 5ead19065b56f2723f157c731ba9693a5aa884768edf3588f5d1dce38f9b1627: Status 404 returned error can't find the container with id 5ead19065b56f2723f157c731ba9693a5aa884768edf3588f5d1dce38f9b1627 Apr 21 16:12:22.085782 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.085767 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:12:22.086166 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.086141 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 16:12:22.086217 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.086197 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 16:12:22.086256 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.086228 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 16:12:22.605400 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.605362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" event={"ID":"3a8f8c79-05b9-4e1f-a208-dca73b68e53b","Type":"ContainerStarted","Data":"922f656cb9ad48dbf1177033a4cdd5245e6e2fdda85b4ade687c557e4cd8387d"} Apr 21 16:12:22.605572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.605408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" event={"ID":"3a8f8c79-05b9-4e1f-a208-dca73b68e53b","Type":"ContainerStarted","Data":"5ead19065b56f2723f157c731ba9693a5aa884768edf3588f5d1dce38f9b1627"} Apr 21 16:12:22.627466 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.627410 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" podStartSLOduration=1.62739655 podStartE2EDuration="1.62739655s" podCreationTimestamp="2026-04-21 16:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:12:22.626356309 +0000 UTC m=+631.002538112" watchObservedRunningTime="2026-04-21 16:12:22.62739655 +0000 UTC m=+631.003578361" Apr 21 16:12:22.942909 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:22.942852 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:23.947509 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:23.947475 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:24.613847 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:24.613817 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:24.614942 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:24.614915 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-j9mdp" Apr 21 16:12:25.828129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:25.827968 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kc6z5"] Apr 21 16:12:25.832770 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:25.832723 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:25.835767 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:25.835725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hpb5g\"" Apr 21 16:12:25.836013 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:25.835725 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 16:12:25.842986 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:25.842933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kc6z5"] Apr 21 16:12:25.937571 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:25.937507 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kc6z5"] Apr 21 16:12:25.962680 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:25.962585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69qhb\" (UniqueName: \"kubernetes.io/projected/f9a16a79-6310-448c-aa00-b9c6ad976176-kube-api-access-69qhb\") pod \"limitador-limitador-7d549b5b-kc6z5\" (UID: \"f9a16a79-6310-448c-aa00-b9c6ad976176\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:25.963078 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:25.962831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f9a16a79-6310-448c-aa00-b9c6ad976176-config-file\") pod \"limitador-limitador-7d549b5b-kc6z5\" (UID: \"f9a16a79-6310-448c-aa00-b9c6ad976176\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:26.064939 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.064081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f9a16a79-6310-448c-aa00-b9c6ad976176-config-file\") pod \"limitador-limitador-7d549b5b-kc6z5\" (UID: \"f9a16a79-6310-448c-aa00-b9c6ad976176\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:26.065335 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.065163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69qhb\" (UniqueName: \"kubernetes.io/projected/f9a16a79-6310-448c-aa00-b9c6ad976176-kube-api-access-69qhb\") pod \"limitador-limitador-7d549b5b-kc6z5\" (UID: \"f9a16a79-6310-448c-aa00-b9c6ad976176\") " 
pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:26.065436 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.065367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f9a16a79-6310-448c-aa00-b9c6ad976176-config-file\") pod \"limitador-limitador-7d549b5b-kc6z5\" (UID: \"f9a16a79-6310-448c-aa00-b9c6ad976176\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:26.075928 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.075844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69qhb\" (UniqueName: \"kubernetes.io/projected/f9a16a79-6310-448c-aa00-b9c6ad976176-kube-api-access-69qhb\") pod \"limitador-limitador-7d549b5b-kc6z5\" (UID: \"f9a16a79-6310-448c-aa00-b9c6ad976176\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:26.153297 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.153223 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:26.352279 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.352234 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kc6z5"] Apr 21 16:12:26.355481 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:12:26.355426 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a16a79_6310_448c_aa00_b9c6ad976176.slice/crio-20f81f15954e5874761e760b61166ca315d63ff3b66ceda0369161d69be32dfe WatchSource:0}: Error finding container 20f81f15954e5874761e760b61166ca315d63ff3b66ceda0369161d69be32dfe: Status 404 returned error can't find the container with id 20f81f15954e5874761e760b61166ca315d63ff3b66ceda0369161d69be32dfe Apr 21 16:12:26.622981 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.622896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" event={"ID":"f9a16a79-6310-448c-aa00-b9c6ad976176","Type":"ContainerStarted","Data":"20f81f15954e5874761e760b61166ca315d63ff3b66ceda0369161d69be32dfe"} Apr 21 16:12:26.649263 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.649233 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mckvx"] Apr 21 16:12:26.653845 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.653825 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" Apr 21 16:12:26.656408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.656384 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-z5n8p\"" Apr 21 16:12:26.659385 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.659364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mckvx"] Apr 21 16:12:26.772060 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.772029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbc44\" (UniqueName: \"kubernetes.io/projected/bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5-kube-api-access-gbc44\") pod \"authorino-f99f4b5cd-mckvx\" (UID: \"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5\") " pod="kuadrant-system/authorino-f99f4b5cd-mckvx" Apr 21 16:12:26.789973 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.789927 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-4kwxt"] Apr 21 16:12:26.793144 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.793127 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-4kwxt" Apr 21 16:12:26.801228 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.801208 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-4kwxt"] Apr 21 16:12:26.872962 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.872940 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbc44\" (UniqueName: \"kubernetes.io/projected/bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5-kube-api-access-gbc44\") pod \"authorino-f99f4b5cd-mckvx\" (UID: \"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5\") " pod="kuadrant-system/authorino-f99f4b5cd-mckvx" Apr 21 16:12:26.882175 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.882119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbc44\" (UniqueName: \"kubernetes.io/projected/bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5-kube-api-access-gbc44\") pod \"authorino-f99f4b5cd-mckvx\" (UID: \"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5\") " pod="kuadrant-system/authorino-f99f4b5cd-mckvx" Apr 21 16:12:26.965261 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.965234 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" Apr 21 16:12:26.974065 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:26.974045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cm4c\" (UniqueName: \"kubernetes.io/projected/d83af0a3-17d3-471d-9842-e5010cc9536a-kube-api-access-4cm4c\") pod \"authorino-7498df8756-4kwxt\" (UID: \"d83af0a3-17d3-471d-9842-e5010cc9536a\") " pod="kuadrant-system/authorino-7498df8756-4kwxt" Apr 21 16:12:27.075208 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:27.075175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cm4c\" (UniqueName: \"kubernetes.io/projected/d83af0a3-17d3-471d-9842-e5010cc9536a-kube-api-access-4cm4c\") pod \"authorino-7498df8756-4kwxt\" (UID: \"d83af0a3-17d3-471d-9842-e5010cc9536a\") " pod="kuadrant-system/authorino-7498df8756-4kwxt" Apr 21 16:12:27.084369 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:27.084283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cm4c\" (UniqueName: \"kubernetes.io/projected/d83af0a3-17d3-471d-9842-e5010cc9536a-kube-api-access-4cm4c\") pod \"authorino-7498df8756-4kwxt\" (UID: \"d83af0a3-17d3-471d-9842-e5010cc9536a\") " pod="kuadrant-system/authorino-7498df8756-4kwxt" Apr 21 16:12:27.086345 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:27.086299 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mckvx"] Apr 21 16:12:27.088874 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:12:27.088547 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd7cb6ea_8b59_4ff1_9cff_d14b8dfd8aa5.slice/crio-577940ab00e6e8c977e2c16229a351063dde51863113fdbc0c1b37901f4424f6 WatchSource:0}: Error finding container 577940ab00e6e8c977e2c16229a351063dde51863113fdbc0c1b37901f4424f6: Status 404 returned error can't find the container with id 577940ab00e6e8c977e2c16229a351063dde51863113fdbc0c1b37901f4424f6 Apr 21 16:12:27.102892 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:27.102840 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-4kwxt" Apr 21 16:12:27.288274 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:27.287918 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-4kwxt"] Apr 21 16:12:27.629487 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:27.629390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-4kwxt" event={"ID":"d83af0a3-17d3-471d-9842-e5010cc9536a","Type":"ContainerStarted","Data":"df65dae55a2e2a70ac26fa2066b86f90e6c45e96b019d3194196b0b02a83251e"} Apr 21 16:12:27.630672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:27.630642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" event={"ID":"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5","Type":"ContainerStarted","Data":"577940ab00e6e8c977e2c16229a351063dde51863113fdbc0c1b37901f4424f6"} Apr 21 16:12:29.643928 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:29.643485 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" event={"ID":"f9a16a79-6310-448c-aa00-b9c6ad976176","Type":"ContainerStarted","Data":"e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03"} Apr 21 16:12:29.644341 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:29.644290 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:29.665412 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:29.664027 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" podStartSLOduration=1.627645682 podStartE2EDuration="4.664009792s" podCreationTimestamp="2026-04-21 16:12:25 +0000 UTC" firstStartedPulling="2026-04-21 16:12:26.358563512 +0000 UTC m=+634.734745300" lastFinishedPulling="2026-04-21 16:12:29.394927621 +0000 UTC m=+637.771109410" observedRunningTime="2026-04-21 16:12:29.6609449 +0000 UTC m=+638.037126717" watchObservedRunningTime="2026-04-21 16:12:29.664009792 +0000 UTC m=+638.040191604" Apr 21 16:12:32.658166 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:32.658126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-4kwxt" event={"ID":"d83af0a3-17d3-471d-9842-e5010cc9536a","Type":"ContainerStarted","Data":"ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd"} Apr 21 16:12:32.659471 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:32.659440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" event={"ID":"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5","Type":"ContainerStarted","Data":"1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524"} Apr 21 16:12:32.674586 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:32.674545 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-4kwxt" podStartSLOduration=1.870196277 podStartE2EDuration="6.674530947s" podCreationTimestamp="2026-04-21 16:12:26 +0000 UTC" firstStartedPulling="2026-04-21 16:12:27.293089179 +0000 UTC m=+635.669270974" lastFinishedPulling="2026-04-21 16:12:32.097423848 +0000 UTC m=+640.473605644" observedRunningTime="2026-04-21 16:12:32.672126638 +0000 UTC m=+641.048308450" watchObservedRunningTime="2026-04-21 16:12:32.674530947 +0000 UTC m=+641.050712736" Apr 21 16:12:32.688554 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:32.688506 
2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" podStartSLOduration=1.692864844 podStartE2EDuration="6.688489947s" podCreationTimestamp="2026-04-21 16:12:26 +0000 UTC" firstStartedPulling="2026-04-21 16:12:27.090222016 +0000 UTC m=+635.466403805" lastFinishedPulling="2026-04-21 16:12:32.085847106 +0000 UTC m=+640.462028908" observedRunningTime="2026-04-21 16:12:32.685705988 +0000 UTC m=+641.061887801" watchObservedRunningTime="2026-04-21 16:12:32.688489947 +0000 UTC m=+641.064671759" Apr 21 16:12:32.709786 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:32.709755 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mckvx"] Apr 21 16:12:34.667407 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:34.667360 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" podUID="bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5" containerName="authorino" containerID="cri-o://1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524" gracePeriod=30 Apr 21 16:12:34.913515 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:34.913491 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" Apr 21 16:12:34.949003 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:34.948934 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbc44\" (UniqueName: \"kubernetes.io/projected/bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5-kube-api-access-gbc44\") pod \"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5\" (UID: \"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5\") " Apr 21 16:12:34.950848 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:34.950822 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5-kube-api-access-gbc44" (OuterVolumeSpecName: "kube-api-access-gbc44") pod "bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5" (UID: "bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5"). InnerVolumeSpecName "kube-api-access-gbc44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:12:35.050228 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.050205 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbc44\" (UniqueName: \"kubernetes.io/projected/bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5-kube-api-access-gbc44\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:12:35.672617 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.672584 2576 generic.go:358] "Generic (PLEG): container finished" podID="bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5" containerID="1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524" exitCode=0 Apr 21 16:12:35.673091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.672631 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" Apr 21 16:12:35.673091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.672670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" event={"ID":"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5","Type":"ContainerDied","Data":"1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524"} Apr 21 16:12:35.673091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.672714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-mckvx" event={"ID":"bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5","Type":"ContainerDied","Data":"577940ab00e6e8c977e2c16229a351063dde51863113fdbc0c1b37901f4424f6"} Apr 21 16:12:35.673091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.672736 2576 scope.go:117] "RemoveContainer" containerID="1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524" Apr 21 16:12:35.682507 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.682483 2576 scope.go:117] "RemoveContainer" containerID="1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524" Apr 21 16:12:35.682799 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:12:35.682782 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524\": container with ID starting with 1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524 not found: ID does not exist" containerID="1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524" Apr 21 16:12:35.682871 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.682807 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524"} err="failed to get container status \"1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524\": rpc error: code = NotFound desc = could not find container \"1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524\": container with ID starting with 1a1d4bc8ce9c34b76d0ca787c0841a1f2e74f656a1511741815a3bdb8efc8524 not found: ID does not exist" Apr 21 16:12:35.695033 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.695004 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mckvx"] Apr 21 16:12:35.697369 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:35.697348 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mckvx"] Apr 21 16:12:36.254877 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:36.254831 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5" path="/var/lib/kubelet/pods/bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5/volumes" Apr 21 16:12:41.653996 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:41.653965 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:42.040773 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.040693 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kc6z5"] Apr 21 16:12:42.040943 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.040918 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" podUID="f9a16a79-6310-448c-aa00-b9c6ad976176" 
containerName="limitador" containerID="cri-o://e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03" gracePeriod=30 Apr 21 16:12:42.589611 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.589590 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:42.707989 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.707959 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9a16a79-6310-448c-aa00-b9c6ad976176" containerID="e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03" exitCode=0 Apr 21 16:12:42.708420 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.708020 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" Apr 21 16:12:42.708420 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.708050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" event={"ID":"f9a16a79-6310-448c-aa00-b9c6ad976176","Type":"ContainerDied","Data":"e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03"} Apr 21 16:12:42.708420 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.708097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kc6z5" event={"ID":"f9a16a79-6310-448c-aa00-b9c6ad976176","Type":"ContainerDied","Data":"20f81f15954e5874761e760b61166ca315d63ff3b66ceda0369161d69be32dfe"} Apr 21 16:12:42.708420 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.708114 2576 scope.go:117] "RemoveContainer" containerID="e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03" Apr 21 16:12:42.716150 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.716121 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f9a16a79-6310-448c-aa00-b9c6ad976176-config-file\") pod \"f9a16a79-6310-448c-aa00-b9c6ad976176\" (UID: \"f9a16a79-6310-448c-aa00-b9c6ad976176\") " Apr 21 16:12:42.716378 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.716244 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69qhb\" (UniqueName: \"kubernetes.io/projected/f9a16a79-6310-448c-aa00-b9c6ad976176-kube-api-access-69qhb\") pod \"f9a16a79-6310-448c-aa00-b9c6ad976176\" (UID: \"f9a16a79-6310-448c-aa00-b9c6ad976176\") " Apr 21 16:12:42.716488 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.716467 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a16a79-6310-448c-aa00-b9c6ad976176-config-file" (OuterVolumeSpecName: "config-file") pod "f9a16a79-6310-448c-aa00-b9c6ad976176" (UID: "f9a16a79-6310-448c-aa00-b9c6ad976176"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:12:42.716702 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.716688 2576 scope.go:117] "RemoveContainer" containerID="e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03" Apr 21 16:12:42.716961 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:12:42.716935 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03\": container with ID starting with e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03 not found: ID does not exist" containerID="e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03" Apr 21 16:12:42.717060 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.716968 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03"} err="failed to get container status \"e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03\": rpc error: code = NotFound desc = could not find container \"e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03\": container with ID starting with e0c27bbbcb2ad56a5604b5b9bd37d9a66e812b1cd1ae5adf7196d06379769b03 not found: ID does not exist" Apr 21 16:12:42.718205 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.718184 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a16a79-6310-448c-aa00-b9c6ad976176-kube-api-access-69qhb" (OuterVolumeSpecName: "kube-api-access-69qhb") pod "f9a16a79-6310-448c-aa00-b9c6ad976176" (UID: "f9a16a79-6310-448c-aa00-b9c6ad976176"). InnerVolumeSpecName "kube-api-access-69qhb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:12:42.817241 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.817214 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f9a16a79-6310-448c-aa00-b9c6ad976176-config-file\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:12:42.817241 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:42.817239 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-69qhb\" (UniqueName: \"kubernetes.io/projected/f9a16a79-6310-448c-aa00-b9c6ad976176-kube-api-access-69qhb\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:12:43.031618 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:43.031587 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kc6z5"] Apr 21 16:12:43.037553 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:43.037520 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kc6z5"] Apr 21 16:12:44.252790 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:44.252755 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9a16a79-6310-448c-aa00-b9c6ad976176" path="/var/lib/kubelet/pods/f9a16a79-6310-448c-aa00-b9c6ad976176/volumes" Apr 21 16:12:46.894064 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.894030 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-xjfcw"] Apr 21 16:12:46.894605 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.894583 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9a16a79-6310-448c-aa00-b9c6ad976176" containerName="limitador" Apr 21 
16:12:46.894731 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.894608 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a16a79-6310-448c-aa00-b9c6ad976176" containerName="limitador" Apr 21 16:12:46.894731 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.894620 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5" containerName="authorino" Apr 21 16:12:46.894731 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.894628 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5" containerName="authorino" Apr 21 16:12:46.894909 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.894775 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd7cb6ea-8b59-4ff1-9cff-d14b8dfd8aa5" containerName="authorino" Apr 21 16:12:46.894909 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.894790 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9a16a79-6310-448c-aa00-b9c6ad976176" containerName="limitador" Apr 21 16:12:46.897831 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.897808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:46.900883 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.900837 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 21 16:12:46.901011 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.900904 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-mv7fk\"" Apr 21 16:12:46.908402 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.908368 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-xjfcw"] Apr 21 16:12:46.946833 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.946811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fda97a47-c138-4a75-b145-85d16968d1eb-data\") pod \"postgres-868db5846d-xjfcw\" (UID: \"fda97a47-c138-4a75-b145-85d16968d1eb\") " pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:46.946957 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:46.946935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvcxw\" (UniqueName: \"kubernetes.io/projected/fda97a47-c138-4a75-b145-85d16968d1eb-kube-api-access-hvcxw\") pod \"postgres-868db5846d-xjfcw\" (UID: \"fda97a47-c138-4a75-b145-85d16968d1eb\") " pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:47.047848 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:47.047815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvcxw\" (UniqueName: \"kubernetes.io/projected/fda97a47-c138-4a75-b145-85d16968d1eb-kube-api-access-hvcxw\") pod \"postgres-868db5846d-xjfcw\" (UID: \"fda97a47-c138-4a75-b145-85d16968d1eb\") " pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:47.048038 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:47.047905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fda97a47-c138-4a75-b145-85d16968d1eb-data\") pod \"postgres-868db5846d-xjfcw\" (UID: \"fda97a47-c138-4a75-b145-85d16968d1eb\") " pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:47.048262 ip-10-0-138-191 kubenswrapper[2576]: 
I0421 16:12:47.048242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fda97a47-c138-4a75-b145-85d16968d1eb-data\") pod \"postgres-868db5846d-xjfcw\" (UID: \"fda97a47-c138-4a75-b145-85d16968d1eb\") " pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:47.062992 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:47.062962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvcxw\" (UniqueName: \"kubernetes.io/projected/fda97a47-c138-4a75-b145-85d16968d1eb-kube-api-access-hvcxw\") pod \"postgres-868db5846d-xjfcw\" (UID: \"fda97a47-c138-4a75-b145-85d16968d1eb\") " pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:47.211460 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:47.211270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:47.401423 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:47.401371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-xjfcw"] Apr 21 16:12:47.404669 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:12:47.404603 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfda97a47_c138_4a75_b145_85d16968d1eb.slice/crio-64716425cad00c2a57c3549d5916b8667e83848733e609880ecc52124468adb2 WatchSource:0}: Error finding container 64716425cad00c2a57c3549d5916b8667e83848733e609880ecc52124468adb2: Status 404 returned error can't find the container with id 64716425cad00c2a57c3549d5916b8667e83848733e609880ecc52124468adb2 Apr 21 16:12:47.732399 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:47.732333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-xjfcw" event={"ID":"fda97a47-c138-4a75-b145-85d16968d1eb","Type":"ContainerStarted","Data":"64716425cad00c2a57c3549d5916b8667e83848733e609880ecc52124468adb2"} Apr 21 16:12:52.180194 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:52.180170 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 21 16:12:52.762291 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:52.762254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-xjfcw" event={"ID":"fda97a47-c138-4a75-b145-85d16968d1eb","Type":"ContainerStarted","Data":"ee18d1d332f8f9f5de532019985c6e98fd890ff00016d0a188236e2e7e0a92e0"} Apr 21 16:12:52.762450 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:52.762362 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:52.786910 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:52.786853 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-xjfcw" podStartSLOduration=2.01746045 podStartE2EDuration="6.786840606s" podCreationTimestamp="2026-04-21 16:12:46 +0000 UTC" firstStartedPulling="2026-04-21 16:12:47.406596541 +0000 UTC m=+655.782778331" lastFinishedPulling="2026-04-21 16:12:52.175976698 +0000 UTC m=+660.552158487" observedRunningTime="2026-04-21 16:12:52.785579098 +0000 UTC m=+661.161760911" watchObservedRunningTime="2026-04-21 16:12:52.786840606 +0000 UTC m=+661.163022416" Apr 21 16:12:58.807624 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:58.807569 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/postgres-868db5846d-xjfcw" Apr 21 16:12:59.343285 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.343256 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vwcmj"] Apr 21 16:12:59.351570 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.351543 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-vwcmj" Apr 21 16:12:59.352269 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.352243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vwcmj"] Apr 21 16:12:59.452720 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.452695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzmd\" (UniqueName: \"kubernetes.io/projected/fb95e6d1-295f-4319-96f4-e8f7406629d1-kube-api-access-6mzmd\") pod \"authorino-8b475cf9f-vwcmj\" (UID: \"fb95e6d1-295f-4319-96f4-e8f7406629d1\") " pod="kuadrant-system/authorino-8b475cf9f-vwcmj" Apr 21 16:12:59.545777 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.545752 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vwcmj"] Apr 21 16:12:59.545982 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:12:59.545961 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-6mzmd], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-vwcmj" podUID="fb95e6d1-295f-4319-96f4-e8f7406629d1" Apr 21 16:12:59.553627 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.553604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzmd\" (UniqueName: \"kubernetes.io/projected/fb95e6d1-295f-4319-96f4-e8f7406629d1-kube-api-access-6mzmd\") pod \"authorino-8b475cf9f-vwcmj\" (UID: \"fb95e6d1-295f-4319-96f4-e8f7406629d1\") " pod="kuadrant-system/authorino-8b475cf9f-vwcmj" Apr 21 16:12:59.566023 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.565997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzmd\" (UniqueName: \"kubernetes.io/projected/fb95e6d1-295f-4319-96f4-e8f7406629d1-kube-api-access-6mzmd\") pod \"authorino-8b475cf9f-vwcmj\" (UID: \"fb95e6d1-295f-4319-96f4-e8f7406629d1\") " pod="kuadrant-system/authorino-8b475cf9f-vwcmj" Apr 21 16:12:59.570197 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.570176 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56fdd757f5-cpzg4"] Apr 21 16:12:59.573770 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.573752 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.576292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.576274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 16:12:59.585903 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.585877 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-cpzg4"] Apr 21 16:12:59.607518 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.607464 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-cpzg4"] Apr 21 16:12:59.607673 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:12:59.607655 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-dc65b tls-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-56fdd757f5-cpzg4" podUID="14995f34-d2f9-4039-91b9-b6ba9a14e449" Apr 21 16:12:59.635443 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.635420 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-57c6774d59-6tk24"] Apr 21 16:12:59.641256 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.641229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:12:59.646812 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.646786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-57c6774d59-6tk24"] Apr 21 16:12:59.654126 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.654104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc65b\" (UniqueName: \"kubernetes.io/projected/14995f34-d2f9-4039-91b9-b6ba9a14e449-kube-api-access-dc65b\") pod \"authorino-56fdd757f5-cpzg4\" (UID: \"14995f34-d2f9-4039-91b9-b6ba9a14e449\") " pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.654209 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.654164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/14995f34-d2f9-4039-91b9-b6ba9a14e449-tls-cert\") pod \"authorino-56fdd757f5-cpzg4\" (UID: \"14995f34-d2f9-4039-91b9-b6ba9a14e449\") " pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.754476 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.754444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1708d9cf-8896-4a8c-be00-7d465c7eef29-tls-cert\") pod \"authorino-57c6774d59-6tk24\" (UID: \"1708d9cf-8896-4a8c-be00-7d465c7eef29\") " pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:12:59.754593 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.754494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dc65b\" (UniqueName: \"kubernetes.io/projected/14995f34-d2f9-4039-91b9-b6ba9a14e449-kube-api-access-dc65b\") pod \"authorino-56fdd757f5-cpzg4\" (UID: \"14995f34-d2f9-4039-91b9-b6ba9a14e449\") " pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.754593 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.754521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/14995f34-d2f9-4039-91b9-b6ba9a14e449-tls-cert\") pod 
\"authorino-56fdd757f5-cpzg4\" (UID: \"14995f34-d2f9-4039-91b9-b6ba9a14e449\") " pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.754593 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.754568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c5hk\" (UniqueName: \"kubernetes.io/projected/1708d9cf-8896-4a8c-be00-7d465c7eef29-kube-api-access-7c5hk\") pod \"authorino-57c6774d59-6tk24\" (UID: \"1708d9cf-8896-4a8c-be00-7d465c7eef29\") " pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:12:59.756889 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.756848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/14995f34-d2f9-4039-91b9-b6ba9a14e449-tls-cert\") pod \"authorino-56fdd757f5-cpzg4\" (UID: \"14995f34-d2f9-4039-91b9-b6ba9a14e449\") " pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.762374 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.762352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc65b\" (UniqueName: \"kubernetes.io/projected/14995f34-d2f9-4039-91b9-b6ba9a14e449-kube-api-access-dc65b\") pod \"authorino-56fdd757f5-cpzg4\" (UID: \"14995f34-d2f9-4039-91b9-b6ba9a14e449\") " pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.792251 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.792230 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.792338 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.792230 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-vwcmj" Apr 21 16:12:59.796669 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.796652 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:12:59.799840 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.799819 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-vwcmj" Apr 21 16:12:59.854961 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.854935 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mzmd\" (UniqueName: \"kubernetes.io/projected/fb95e6d1-295f-4319-96f4-e8f7406629d1-kube-api-access-6mzmd\") pod \"fb95e6d1-295f-4319-96f4-e8f7406629d1\" (UID: \"fb95e6d1-295f-4319-96f4-e8f7406629d1\") " Apr 21 16:12:59.855318 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.854991 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc65b\" (UniqueName: \"kubernetes.io/projected/14995f34-d2f9-4039-91b9-b6ba9a14e449-kube-api-access-dc65b\") pod \"14995f34-d2f9-4039-91b9-b6ba9a14e449\" (UID: \"14995f34-d2f9-4039-91b9-b6ba9a14e449\") " Apr 21 16:12:59.855318 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.855017 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/14995f34-d2f9-4039-91b9-b6ba9a14e449-tls-cert\") pod \"14995f34-d2f9-4039-91b9-b6ba9a14e449\" (UID: \"14995f34-d2f9-4039-91b9-b6ba9a14e449\") " Apr 21 16:12:59.855318 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.855107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1708d9cf-8896-4a8c-be00-7d465c7eef29-tls-cert\") pod \"authorino-57c6774d59-6tk24\" (UID: \"1708d9cf-8896-4a8c-be00-7d465c7eef29\") " pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:12:59.855318 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.855161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c5hk\" (UniqueName: \"kubernetes.io/projected/1708d9cf-8896-4a8c-be00-7d465c7eef29-kube-api-access-7c5hk\") pod \"authorino-57c6774d59-6tk24\" (UID: \"1708d9cf-8896-4a8c-be00-7d465c7eef29\") " pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:12:59.857065 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.857042 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14995f34-d2f9-4039-91b9-b6ba9a14e449-kube-api-access-dc65b" (OuterVolumeSpecName: "kube-api-access-dc65b") pod "14995f34-d2f9-4039-91b9-b6ba9a14e449" (UID: "14995f34-d2f9-4039-91b9-b6ba9a14e449"). InnerVolumeSpecName "kube-api-access-dc65b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:12:59.857141 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.857057 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb95e6d1-295f-4319-96f4-e8f7406629d1-kube-api-access-6mzmd" (OuterVolumeSpecName: "kube-api-access-6mzmd") pod "fb95e6d1-295f-4319-96f4-e8f7406629d1" (UID: "fb95e6d1-295f-4319-96f4-e8f7406629d1"). InnerVolumeSpecName "kube-api-access-6mzmd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:12:59.857408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.857394 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14995f34-d2f9-4039-91b9-b6ba9a14e449-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "14995f34-d2f9-4039-91b9-b6ba9a14e449" (UID: "14995f34-d2f9-4039-91b9-b6ba9a14e449"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:12:59.857764 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.857715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1708d9cf-8896-4a8c-be00-7d465c7eef29-tls-cert\") pod \"authorino-57c6774d59-6tk24\" (UID: \"1708d9cf-8896-4a8c-be00-7d465c7eef29\") " pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:12:59.862613 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.862590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c5hk\" (UniqueName: \"kubernetes.io/projected/1708d9cf-8896-4a8c-be00-7d465c7eef29-kube-api-access-7c5hk\") pod \"authorino-57c6774d59-6tk24\" (UID: \"1708d9cf-8896-4a8c-be00-7d465c7eef29\") " pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:12:59.951572 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.951551 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:12:59.956303 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.956283 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dc65b\" (UniqueName: \"kubernetes.io/projected/14995f34-d2f9-4039-91b9-b6ba9a14e449-kube-api-access-dc65b\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:12:59.956391 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.956304 2576 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/14995f34-d2f9-4039-91b9-b6ba9a14e449-tls-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:12:59.956391 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:12:59.956316 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6mzmd\" (UniqueName: \"kubernetes.io/projected/fb95e6d1-295f-4319-96f4-e8f7406629d1-kube-api-access-6mzmd\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:13:00.071729 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.071705 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-57c6774d59-6tk24"] Apr 21 16:13:00.073728 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:13:00.073699 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1708d9cf_8896_4a8c_be00_7d465c7eef29.slice/crio-70bda40205a4841dc67faba7450be2fa029314d0cd59277fa504c0a804050fb8 WatchSource:0}: Error finding container 70bda40205a4841dc67faba7450be2fa029314d0cd59277fa504c0a804050fb8: Status 404 returned error can't find the container with id 70bda40205a4841dc67faba7450be2fa029314d0cd59277fa504c0a804050fb8 Apr 21 16:13:00.797640 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.797544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57c6774d59-6tk24" event={"ID":"1708d9cf-8896-4a8c-be00-7d465c7eef29","Type":"ContainerStarted","Data":"cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac"} Apr 21 16:13:00.797640 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.797588 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57c6774d59-6tk24" event={"ID":"1708d9cf-8896-4a8c-be00-7d465c7eef29","Type":"ContainerStarted","Data":"70bda40205a4841dc67faba7450be2fa029314d0cd59277fa504c0a804050fb8"} Apr 21 16:13:00.797640 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.797635 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-cpzg4" Apr 21 16:13:00.797918 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.797741 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-vwcmj" Apr 21 16:13:00.816981 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.816937 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-57c6774d59-6tk24" podStartSLOduration=1.429488423 podStartE2EDuration="1.816926817s" podCreationTimestamp="2026-04-21 16:12:59 +0000 UTC" firstStartedPulling="2026-04-21 16:13:00.075029197 +0000 UTC m=+668.451210986" lastFinishedPulling="2026-04-21 16:13:00.462467578 +0000 UTC m=+668.838649380" observedRunningTime="2026-04-21 16:13:00.815840853 +0000 UTC m=+669.192022664" watchObservedRunningTime="2026-04-21 16:13:00.816926817 +0000 UTC m=+669.193108663" Apr 21 16:13:00.844136 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.844111 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-4kwxt"] Apr 21 16:13:00.844286 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.844268 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-4kwxt" podUID="d83af0a3-17d3-471d-9842-e5010cc9536a" containerName="authorino" containerID="cri-o://ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd" gracePeriod=30 Apr 21 16:13:00.850517 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.850496 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vwcmj"] Apr 21 16:13:00.857005 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.856981 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vwcmj"] Apr 21 16:13:00.881619 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.881594 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-cpzg4"] Apr 21 16:13:00.885162 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:00.885142 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-cpzg4"] Apr 21 16:13:01.084172 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.084143 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-4kwxt" Apr 21 16:13:01.167699 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.167674 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cm4c\" (UniqueName: \"kubernetes.io/projected/d83af0a3-17d3-471d-9842-e5010cc9536a-kube-api-access-4cm4c\") pod \"d83af0a3-17d3-471d-9842-e5010cc9536a\" (UID: \"d83af0a3-17d3-471d-9842-e5010cc9536a\") " Apr 21 16:13:01.169698 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.169669 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d83af0a3-17d3-471d-9842-e5010cc9536a-kube-api-access-4cm4c" (OuterVolumeSpecName: "kube-api-access-4cm4c") pod "d83af0a3-17d3-471d-9842-e5010cc9536a" (UID: "d83af0a3-17d3-471d-9842-e5010cc9536a"). InnerVolumeSpecName "kube-api-access-4cm4c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:13:01.269031 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.268990 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cm4c\" (UniqueName: \"kubernetes.io/projected/d83af0a3-17d3-471d-9842-e5010cc9536a-kube-api-access-4cm4c\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:13:01.628091 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.628045 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-4hmfn"] Apr 21 16:13:01.628496 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.628479 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d83af0a3-17d3-471d-9842-e5010cc9536a" containerName="authorino" Apr 21 16:13:01.628588 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.628500 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83af0a3-17d3-471d-9842-e5010cc9536a" containerName="authorino" Apr 21 16:13:01.628645 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.628615 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d83af0a3-17d3-471d-9842-e5010cc9536a" containerName="authorino" Apr 21 16:13:01.632164 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.632144 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" Apr 21 16:13:01.634992 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.634969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hr5hm\"" Apr 21 16:13:01.649838 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.649815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-4hmfn"] Apr 21 16:13:01.772958 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.772927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gbzt\" (UniqueName: \"kubernetes.io/projected/0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4-kube-api-access-8gbzt\") pod \"maas-controller-6d4c8f55f9-4hmfn\" (UID: \"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4\") " pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" Apr 21 16:13:01.802142 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.802112 2576 generic.go:358] "Generic (PLEG): container finished" podID="d83af0a3-17d3-471d-9842-e5010cc9536a" containerID="ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd" exitCode=0 Apr 21 16:13:01.802356 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.802323 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-4kwxt" Apr 21 16:13:01.802356 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.802322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-4kwxt" event={"ID":"d83af0a3-17d3-471d-9842-e5010cc9536a","Type":"ContainerDied","Data":"ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd"} Apr 21 16:13:01.802548 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.802377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-4kwxt" event={"ID":"d83af0a3-17d3-471d-9842-e5010cc9536a","Type":"ContainerDied","Data":"df65dae55a2e2a70ac26fa2066b86f90e6c45e96b019d3194196b0b02a83251e"} Apr 21 16:13:01.802548 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.802400 2576 scope.go:117] "RemoveContainer" containerID="ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd" Apr 21 16:13:01.805505 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.805481 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-86c5747dbd-62jq4"] Apr 21 16:13:01.809172 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.809145 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:01.811942 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.811921 2576 scope.go:117] "RemoveContainer" containerID="ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd" Apr 21 16:13:01.812192 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:13:01.812172 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd\": container with ID starting with ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd not found: ID does not exist" containerID="ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd" Apr 21 16:13:01.812254 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.812200 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd"} err="failed to get container status \"ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd\": rpc error: code = NotFound desc = could not find container \"ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd\": container with ID starting with ee619fc3e77225d8ee37bdcb90259d8bf351699afe8d4b653c4865d805af58cd not found: ID does not exist" Apr 21 16:13:01.822019 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.821998 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-86c5747dbd-62jq4"] Apr 21 16:13:01.838752 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.838725 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-4kwxt"] Apr 21 16:13:01.842690 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.842667 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-4kwxt"] Apr 21 16:13:01.873603 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.873580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zmpv\" (UniqueName: \"kubernetes.io/projected/e5959fd8-0f20-4dd4-8e69-26495d34918b-kube-api-access-9zmpv\") pod \"maas-controller-86c5747dbd-62jq4\" (UID: 
\"e5959fd8-0f20-4dd4-8e69-26495d34918b\") " pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:01.873930 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.873649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gbzt\" (UniqueName: \"kubernetes.io/projected/0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4-kube-api-access-8gbzt\") pod \"maas-controller-6d4c8f55f9-4hmfn\" (UID: \"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4\") " pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" Apr 21 16:13:01.883006 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.882956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gbzt\" (UniqueName: \"kubernetes.io/projected/0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4-kube-api-access-8gbzt\") pod \"maas-controller-6d4c8f55f9-4hmfn\" (UID: \"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4\") " pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" Apr 21 16:13:01.936793 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.936768 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-4hmfn"] Apr 21 16:13:01.937023 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.937011 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" Apr 21 16:13:01.974262 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.974228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zmpv\" (UniqueName: \"kubernetes.io/projected/e5959fd8-0f20-4dd4-8e69-26495d34918b-kube-api-access-9zmpv\") pod \"maas-controller-86c5747dbd-62jq4\" (UID: \"e5959fd8-0f20-4dd4-8e69-26495d34918b\") " pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:01.982377 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:01.982348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zmpv\" (UniqueName: \"kubernetes.io/projected/e5959fd8-0f20-4dd4-8e69-26495d34918b-kube-api-access-9zmpv\") pod \"maas-controller-86c5747dbd-62jq4\" (UID: \"e5959fd8-0f20-4dd4-8e69-26495d34918b\") " pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:02.062109 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:02.062080 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-4hmfn"] Apr 21 16:13:02.063195 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:13:02.063164 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6c2a61_29e5_4ea3_b9c8_4fe8e4bba7a4.slice/crio-84d11ba17def912da14a34d3389265a28c763cc50cdf11052ec1af1ad04372a8 WatchSource:0}: Error finding container 84d11ba17def912da14a34d3389265a28c763cc50cdf11052ec1af1ad04372a8: Status 404 returned error can't find the container with id 84d11ba17def912da14a34d3389265a28c763cc50cdf11052ec1af1ad04372a8 Apr 21 16:13:02.120381 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:02.120359 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:02.243166 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:02.243141 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-86c5747dbd-62jq4"] Apr 21 16:13:02.244269 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:13:02.244240 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5959fd8_0f20_4dd4_8e69_26495d34918b.slice/crio-58e8d052a2b4515df1c53ef06e5c73dce2322e2a42fd12b115a9ce0b05040ccd WatchSource:0}: Error finding container 58e8d052a2b4515df1c53ef06e5c73dce2322e2a42fd12b115a9ce0b05040ccd: Status 404 returned error can't find the container with id 58e8d052a2b4515df1c53ef06e5c73dce2322e2a42fd12b115a9ce0b05040ccd Apr 21 16:13:02.251843 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:02.251821 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14995f34-d2f9-4039-91b9-b6ba9a14e449" path="/var/lib/kubelet/pods/14995f34-d2f9-4039-91b9-b6ba9a14e449/volumes" Apr 21 16:13:02.252093 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:02.252078 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d83af0a3-17d3-471d-9842-e5010cc9536a" path="/var/lib/kubelet/pods/d83af0a3-17d3-471d-9842-e5010cc9536a/volumes" Apr 21 16:13:02.252379 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:02.252366 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb95e6d1-295f-4319-96f4-e8f7406629d1" path="/var/lib/kubelet/pods/fb95e6d1-295f-4319-96f4-e8f7406629d1/volumes" Apr 21 16:13:02.821673 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:02.821627 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" event={"ID":"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4","Type":"ContainerStarted","Data":"84d11ba17def912da14a34d3389265a28c763cc50cdf11052ec1af1ad04372a8"} Apr 21 16:13:02.824050 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:02.824019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86c5747dbd-62jq4" event={"ID":"e5959fd8-0f20-4dd4-8e69-26495d34918b","Type":"ContainerStarted","Data":"58e8d052a2b4515df1c53ef06e5c73dce2322e2a42fd12b115a9ce0b05040ccd"} Apr 21 16:13:05.837163 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:05.837074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86c5747dbd-62jq4" event={"ID":"e5959fd8-0f20-4dd4-8e69-26495d34918b","Type":"ContainerStarted","Data":"4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b"} Apr 21 16:13:05.837163 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:05.837124 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:05.838548 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:05.838523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" event={"ID":"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4","Type":"ContainerStarted","Data":"173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8"} Apr 21 16:13:05.838655 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:05.838603 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" podUID="0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4" containerName="manager" 
containerID="cri-o://173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8" gracePeriod=10 Apr 21 16:13:05.838655 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:05.838632 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" Apr 21 16:13:05.855060 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:05.855020 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-86c5747dbd-62jq4" podStartSLOduration=1.571110878 podStartE2EDuration="4.855008128s" podCreationTimestamp="2026-04-21 16:13:01 +0000 UTC" firstStartedPulling="2026-04-21 16:13:02.245657133 +0000 UTC m=+670.621838924" lastFinishedPulling="2026-04-21 16:13:05.529554382 +0000 UTC m=+673.905736174" observedRunningTime="2026-04-21 16:13:05.853341358 +0000 UTC m=+674.229523169" watchObservedRunningTime="2026-04-21 16:13:05.855008128 +0000 UTC m=+674.231189988" Apr 21 16:13:05.874250 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:05.874211 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" podStartSLOduration=1.416186637 podStartE2EDuration="4.874199977s" podCreationTimestamp="2026-04-21 16:13:01 +0000 UTC" firstStartedPulling="2026-04-21 16:13:02.064631131 +0000 UTC m=+670.440812930" lastFinishedPulling="2026-04-21 16:13:05.522644477 +0000 UTC m=+673.898826270" observedRunningTime="2026-04-21 16:13:05.872381073 +0000 UTC m=+674.248562908" watchObservedRunningTime="2026-04-21 16:13:05.874199977 +0000 UTC m=+674.250381787" Apr 21 16:13:06.091949 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.091886 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" Apr 21 16:13:06.215738 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.215711 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gbzt\" (UniqueName: \"kubernetes.io/projected/0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4-kube-api-access-8gbzt\") pod \"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4\" (UID: \"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4\") " Apr 21 16:13:06.217838 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.217816 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4-kube-api-access-8gbzt" (OuterVolumeSpecName: "kube-api-access-8gbzt") pod "0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4" (UID: "0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4"). InnerVolumeSpecName "kube-api-access-8gbzt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:13:06.317159 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.317137 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8gbzt\" (UniqueName: \"kubernetes.io/projected/0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4-kube-api-access-8gbzt\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:13:06.844032 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.843944 2576 generic.go:358] "Generic (PLEG): container finished" podID="0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4" containerID="173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8" exitCode=0 Apr 21 16:13:06.844032 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.844016 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" Apr 21 16:13:06.844576 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.844029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" event={"ID":"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4","Type":"ContainerDied","Data":"173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8"} Apr 21 16:13:06.844576 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.844063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-4hmfn" event={"ID":"0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4","Type":"ContainerDied","Data":"84d11ba17def912da14a34d3389265a28c763cc50cdf11052ec1af1ad04372a8"} Apr 21 16:13:06.844576 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.844078 2576 scope.go:117] "RemoveContainer" containerID="173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8" Apr 21 16:13:06.852879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.852844 2576 scope.go:117] "RemoveContainer" containerID="173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8" Apr 21 16:13:06.853112 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:13:06.853092 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8\": container with ID starting with 173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8 not found: ID does not exist" containerID="173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8" Apr 21 16:13:06.853969 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.853121 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8"} err="failed to get container status \"173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8\": rpc error: code = NotFound desc = could not find container \"173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8\": container with ID starting with 173e532bbc4a54364b4f82ad66837c491851e39dff41b0a55614e34e458f0bb8 not found: ID does not exist" Apr 21 16:13:06.864268 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.864242 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-4hmfn"] Apr 21 16:13:06.867922 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:06.867902 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-4hmfn"] Apr 21 16:13:07.105888 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.105813 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-c785cff45-jzws8"] Apr 21 16:13:07.106242 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.106228 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4" containerName="manager" Apr 21 16:13:07.106283 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.106244 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4" containerName="manager" Apr 21 16:13:07.106326 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.106305 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4" containerName="manager" Apr 21 16:13:07.110967 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.110951 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:07.113821 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.113792 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 16:13:07.113946 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.113794 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 16:13:07.113946 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.113905 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hhkxk\"" Apr 21 16:13:07.123129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.123108 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-c785cff45-jzws8"] Apr 21 16:13:07.225042 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.225015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ce6b668f-9f8e-47bf-9e33-10657870a4a1-maas-api-tls\") pod \"maas-api-c785cff45-jzws8\" (UID: \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\") " pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:07.225195 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.225075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l4qs\" (UniqueName: \"kubernetes.io/projected/ce6b668f-9f8e-47bf-9e33-10657870a4a1-kube-api-access-7l4qs\") pod \"maas-api-c785cff45-jzws8\" (UID: \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\") " pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:07.326253 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.326222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l4qs\" (UniqueName: \"kubernetes.io/projected/ce6b668f-9f8e-47bf-9e33-10657870a4a1-kube-api-access-7l4qs\") pod \"maas-api-c785cff45-jzws8\" (UID: \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\") " pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:07.326407 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.326333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ce6b668f-9f8e-47bf-9e33-10657870a4a1-maas-api-tls\") pod \"maas-api-c785cff45-jzws8\" (UID: \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\") " pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:07.328888 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.328842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ce6b668f-9f8e-47bf-9e33-10657870a4a1-maas-api-tls\") pod \"maas-api-c785cff45-jzws8\" (UID: \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\") " pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:07.334823 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.334803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l4qs\" (UniqueName: \"kubernetes.io/projected/ce6b668f-9f8e-47bf-9e33-10657870a4a1-kube-api-access-7l4qs\") pod \"maas-api-c785cff45-jzws8\" (UID: \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\") " pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:07.421631 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.421606 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:07.753738 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.753714 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-c785cff45-jzws8"] Apr 21 16:13:07.755613 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:13:07.755582 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce6b668f_9f8e_47bf_9e33_10657870a4a1.slice/crio-21c0a0d12410939da64a78154d6c5ea8ca53f3cab4630cf1e9cc63ac9b6f5769 WatchSource:0}: Error finding container 21c0a0d12410939da64a78154d6c5ea8ca53f3cab4630cf1e9cc63ac9b6f5769: Status 404 returned error can't find the container with id 21c0a0d12410939da64a78154d6c5ea8ca53f3cab4630cf1e9cc63ac9b6f5769 Apr 21 16:13:07.849226 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:07.849188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-c785cff45-jzws8" event={"ID":"ce6b668f-9f8e-47bf-9e33-10657870a4a1","Type":"ContainerStarted","Data":"21c0a0d12410939da64a78154d6c5ea8ca53f3cab4630cf1e9cc63ac9b6f5769"} Apr 21 16:13:08.252993 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:08.252962 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4" path="/var/lib/kubelet/pods/0f6c2a61-29e5-4ea3-b9c8-4fe8e4bba7a4/volumes" Apr 21 16:13:09.862737 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:09.862665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-c785cff45-jzws8" event={"ID":"ce6b668f-9f8e-47bf-9e33-10657870a4a1","Type":"ContainerStarted","Data":"f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a"} Apr 21 16:13:09.863634 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:09.862836 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:09.890638 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:09.890495 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-c785cff45-jzws8" podStartSLOduration=1.503237017 podStartE2EDuration="2.890451082s" podCreationTimestamp="2026-04-21 16:13:07 +0000 UTC" firstStartedPulling="2026-04-21 16:13:07.757038713 +0000 UTC m=+676.133220502" lastFinishedPulling="2026-04-21 16:13:09.144252778 +0000 UTC m=+677.520434567" observedRunningTime="2026-04-21 16:13:09.886000561 +0000 UTC m=+678.262182381" watchObservedRunningTime="2026-04-21 16:13:09.890451082 +0000 UTC m=+678.266632896" Apr 21 16:13:15.871804 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:15.871774 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:16.068525 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.068490 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-86c5747dbd-62jq4"] Apr 21 16:13:16.068777 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.068743 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-86c5747dbd-62jq4" podUID="e5959fd8-0f20-4dd4-8e69-26495d34918b" containerName="manager" containerID="cri-o://4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b" gracePeriod=10 Apr 21 16:13:16.074209 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.074185 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:16.325466 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.325441 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:16.512120 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.512044 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zmpv\" (UniqueName: \"kubernetes.io/projected/e5959fd8-0f20-4dd4-8e69-26495d34918b-kube-api-access-9zmpv\") pod \"e5959fd8-0f20-4dd4-8e69-26495d34918b\" (UID: \"e5959fd8-0f20-4dd4-8e69-26495d34918b\") " Apr 21 16:13:16.514102 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.514075 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5959fd8-0f20-4dd4-8e69-26495d34918b-kube-api-access-9zmpv" (OuterVolumeSpecName: "kube-api-access-9zmpv") pod "e5959fd8-0f20-4dd4-8e69-26495d34918b" (UID: "e5959fd8-0f20-4dd4-8e69-26495d34918b"). InnerVolumeSpecName "kube-api-access-9zmpv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:13:16.613636 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.613613 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9zmpv\" (UniqueName: \"kubernetes.io/projected/e5959fd8-0f20-4dd4-8e69-26495d34918b-kube-api-access-9zmpv\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:13:16.891647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.891620 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5959fd8-0f20-4dd4-8e69-26495d34918b" containerID="4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b" exitCode=0 Apr 21 16:13:16.892081 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.891682 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-86c5747dbd-62jq4" Apr 21 16:13:16.892081 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.891716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86c5747dbd-62jq4" event={"ID":"e5959fd8-0f20-4dd4-8e69-26495d34918b","Type":"ContainerDied","Data":"4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b"} Apr 21 16:13:16.892081 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.891765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86c5747dbd-62jq4" event={"ID":"e5959fd8-0f20-4dd4-8e69-26495d34918b","Type":"ContainerDied","Data":"58e8d052a2b4515df1c53ef06e5c73dce2322e2a42fd12b115a9ce0b05040ccd"} Apr 21 16:13:16.892081 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.891788 2576 scope.go:117] "RemoveContainer" containerID="4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b" Apr 21 16:13:16.901714 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.901695 2576 scope.go:117] "RemoveContainer" containerID="4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b" Apr 21 16:13:16.901995 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:13:16.901977 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b\": container with ID starting with 4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b not found: ID does not exist" containerID="4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b" Apr 21 16:13:16.902048 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.902003 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b"} err="failed to get container status \"4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b\": rpc error: code = NotFound desc = could not find container \"4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b\": container with ID starting with 4e331cd64b3a41e0b49288910231fdffb17c28ba1f68062a7cd58744acd2021b not found: ID does not exist" Apr 21 16:13:16.914508 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.914487 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-86c5747dbd-62jq4"] Apr 21 16:13:16.917613 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:16.917591 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-86c5747dbd-62jq4"] Apr 21 16:13:18.252455 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:18.252421 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5959fd8-0f20-4dd4-8e69-26495d34918b" path="/var/lib/kubelet/pods/e5959fd8-0f20-4dd4-8e69-26495d34918b/volumes" Apr 21 16:13:36.858575 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.858542 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s"] Apr 21 16:13:36.859147 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.859126 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5959fd8-0f20-4dd4-8e69-26495d34918b" containerName="manager" Apr 21 16:13:36.859231 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.859150 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5959fd8-0f20-4dd4-8e69-26495d34918b" containerName="manager" Apr 21 16:13:36.859294 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.859267 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5959fd8-0f20-4dd4-8e69-26495d34918b" containerName="manager" Apr 21 16:13:36.886240 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.886212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s"] Apr 21 16:13:36.886240 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.886240 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:36.889115 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.889090 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 16:13:36.889240 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.889151 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 16:13:36.890043 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.890021 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-wpv9r\"" Apr 21 16:13:36.890180 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.890167 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 21 16:13:36.980167 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.980131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:36.980291 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.980188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:36.980291 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.980240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:36.980291 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.980262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:36.980402 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.980324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8mf87\" (UniqueName: \"kubernetes.io/projected/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-kube-api-access-8mf87\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:36.980402 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:36.980390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.081405 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.081515 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.081515 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.081515 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mf87\" (UniqueName: \"kubernetes.io/projected/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-kube-api-access-8mf87\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.081647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.081647 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" 
Apr 21 16:13:37.081842 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.081842 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.082006 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.081905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.083754 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.083730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.083976 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.083961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.089121 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.089104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mf87\" (UniqueName: \"kubernetes.io/projected/d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee-kube-api-access-8mf87\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s\" (UID: \"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.195873 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.195835 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:37.321643 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.321619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s"] Apr 21 16:13:37.323106 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:13:37.323076 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d24cbb_4859_4e5c_b04e_64ff7d22b3ee.slice/crio-a7e90f2839bceba45eb36a55fbf1d56dccf046f6f59dd61e99db6449d8903101 WatchSource:0}: Error finding container a7e90f2839bceba45eb36a55fbf1d56dccf046f6f59dd61e99db6449d8903101: Status 404 returned error can't find the container with id a7e90f2839bceba45eb36a55fbf1d56dccf046f6f59dd61e99db6449d8903101 Apr 21 16:13:37.973835 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:37.973797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" event={"ID":"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee","Type":"ContainerStarted","Data":"a7e90f2839bceba45eb36a55fbf1d56dccf046f6f59dd61e99db6449d8903101"} Apr 21 16:13:45.010039 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:45.010001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" event={"ID":"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee","Type":"ContainerStarted","Data":"6e5ded1b2fd41f02c3d3b215cd010288ae9dca7badd26f116608ed5ab23aebe5"} Apr 21 16:13:53.046127 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:53.046093 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee" containerID="6e5ded1b2fd41f02c3d3b215cd010288ae9dca7badd26f116608ed5ab23aebe5" exitCode=0 Apr 21 16:13:53.046590 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:53.046174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" event={"ID":"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee","Type":"ContainerDied","Data":"6e5ded1b2fd41f02c3d3b215cd010288ae9dca7badd26f116608ed5ab23aebe5"} Apr 21 16:13:55.057585 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:55.057548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" event={"ID":"d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee","Type":"ContainerStarted","Data":"f6b8b0f71f3b8a1fa78c92f94555d8d1117734e7153fbe293499bde05cd12d5d"} Apr 21 16:13:55.057997 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:55.057740 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:13:55.076434 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:55.076390 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" podStartSLOduration=1.979853079 podStartE2EDuration="19.076377327s" podCreationTimestamp="2026-04-21 16:13:36 +0000 UTC" firstStartedPulling="2026-04-21 16:13:37.324794092 +0000 UTC m=+705.700975881" lastFinishedPulling="2026-04-21 16:13:54.421318328 +0000 UTC m=+722.797500129" observedRunningTime="2026-04-21 16:13:55.07503443 +0000 UTC m=+723.451216244" watchObservedRunningTime="2026-04-21 16:13:55.076377327 +0000 UTC m=+723.452559138" Apr 21 16:13:56.308404 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.308363 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-c785cff45-jzws8"] Apr 21 16:13:56.308845 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.308625 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-c785cff45-jzws8" podUID="ce6b668f-9f8e-47bf-9e33-10657870a4a1" containerName="maas-api" containerID="cri-o://f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a" gracePeriod=30 Apr 21 16:13:56.549222 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.549204 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:56.666113 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.666091 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ce6b668f-9f8e-47bf-9e33-10657870a4a1-maas-api-tls\") pod \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\" (UID: \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\") " Apr 21 16:13:56.666305 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.666285 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l4qs\" (UniqueName: \"kubernetes.io/projected/ce6b668f-9f8e-47bf-9e33-10657870a4a1-kube-api-access-7l4qs\") pod \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\" (UID: \"ce6b668f-9f8e-47bf-9e33-10657870a4a1\") " Apr 21 16:13:56.668165 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.668137 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6b668f-9f8e-47bf-9e33-10657870a4a1-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "ce6b668f-9f8e-47bf-9e33-10657870a4a1" (UID: "ce6b668f-9f8e-47bf-9e33-10657870a4a1"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:13:56.668255 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.668204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce6b668f-9f8e-47bf-9e33-10657870a4a1-kube-api-access-7l4qs" (OuterVolumeSpecName: "kube-api-access-7l4qs") pod "ce6b668f-9f8e-47bf-9e33-10657870a4a1" (UID: "ce6b668f-9f8e-47bf-9e33-10657870a4a1"). InnerVolumeSpecName "kube-api-access-7l4qs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:13:56.767166 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.767132 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7l4qs\" (UniqueName: \"kubernetes.io/projected/ce6b668f-9f8e-47bf-9e33-10657870a4a1-kube-api-access-7l4qs\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:13:56.767166 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.767165 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ce6b668f-9f8e-47bf-9e33-10657870a4a1-maas-api-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:13:56.855773 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.855745 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2"] Apr 21 16:13:56.856218 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.856200 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce6b668f-9f8e-47bf-9e33-10657870a4a1" containerName="maas-api" Apr 21 16:13:56.856324 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.856221 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6b668f-9f8e-47bf-9e33-10657870a4a1" containerName="maas-api" Apr 21 16:13:56.856391 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.856327 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce6b668f-9f8e-47bf-9e33-10657870a4a1" containerName="maas-api" Apr 21 16:13:56.862905 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.862885 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:56.865474 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.865455 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 21 16:13:56.871563 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.871543 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2"] Apr 21 16:13:56.968203 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.968145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fd4\" (UniqueName: \"kubernetes.io/projected/9be06ddc-fd85-47f2-ad11-52c616571953-kube-api-access-b8fd4\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:56.968307 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.968193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:56.968307 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.968243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:56.968384 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.968338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:56.968384 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.968361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:56.968384 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:56.968377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9be06ddc-fd85-47f2-ad11-52c616571953-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.066606 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.066574 2576 generic.go:358] "Generic (PLEG): container finished" podID="ce6b668f-9f8e-47bf-9e33-10657870a4a1" containerID="f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a" exitCode=0 Apr 21 16:13:57.066760 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.066617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-c785cff45-jzws8" event={"ID":"ce6b668f-9f8e-47bf-9e33-10657870a4a1","Type":"ContainerDied","Data":"f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a"} Apr 21 16:13:57.066760 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.066641 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-c785cff45-jzws8" Apr 21 16:13:57.066760 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.066660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-c785cff45-jzws8" event={"ID":"ce6b668f-9f8e-47bf-9e33-10657870a4a1","Type":"ContainerDied","Data":"21c0a0d12410939da64a78154d6c5ea8ca53f3cab4630cf1e9cc63ac9b6f5769"} Apr 21 16:13:57.066760 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.066674 2576 scope.go:117] "RemoveContainer" containerID="f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a" Apr 21 16:13:57.069039 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.069152 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.069152 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.069152 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.069308 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9be06ddc-fd85-47f2-ad11-52c616571953-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.069308 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fd4\" (UniqueName: \"kubernetes.io/projected/9be06ddc-fd85-47f2-ad11-52c616571953-kube-api-access-b8fd4\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.069472 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: 
\"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.069655 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.069771 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.069702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.071807 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.071787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9be06ddc-fd85-47f2-ad11-52c616571953-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.072013 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.071995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9be06ddc-fd85-47f2-ad11-52c616571953-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.076360 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.076340 2576 scope.go:117] "RemoveContainer" containerID="f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a" Apr 21 16:13:57.076635 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:13:57.076616 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a\": container with ID starting with f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a not found: ID does not exist" containerID="f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a" Apr 21 16:13:57.076699 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.076645 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a"} err="failed to get container status \"f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a\": rpc error: code = NotFound desc = could not find container \"f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a\": container with ID starting with f2d82b07372c05320465312d60643f48dee2b0d8cd73d6d03902d9b5e24b4e1a not found: ID does not exist" Apr 21 16:13:57.077129 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.077109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fd4\" (UniqueName: \"kubernetes.io/projected/9be06ddc-fd85-47f2-ad11-52c616571953-kube-api-access-b8fd4\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2\" (UID: \"9be06ddc-fd85-47f2-ad11-52c616571953\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.090740 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.090691 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-c785cff45-jzws8"] Apr 21 16:13:57.093604 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.093570 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-c785cff45-jzws8"] Apr 21 16:13:57.175108 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.175013 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:13:57.373281 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:57.373250 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2"] Apr 21 16:13:57.374275 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:13:57.374243 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9be06ddc_fd85_47f2_ad11_52c616571953.slice/crio-4af9495cb2a298087555ca72f9f5fc58f69a337cda28f0e226096cd0b864af6c WatchSource:0}: Error finding container 4af9495cb2a298087555ca72f9f5fc58f69a337cda28f0e226096cd0b864af6c: Status 404 returned error can't find the container with id 4af9495cb2a298087555ca72f9f5fc58f69a337cda28f0e226096cd0b864af6c Apr 21 16:13:58.072843 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:58.072805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" event={"ID":"9be06ddc-fd85-47f2-ad11-52c616571953","Type":"ContainerStarted","Data":"b74534c85a842732f8a5ddda83241ca42bb339149ae29915820165cf0064904a"} Apr 21 16:13:58.072843 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:58.072841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" event={"ID":"9be06ddc-fd85-47f2-ad11-52c616571953","Type":"ContainerStarted","Data":"4af9495cb2a298087555ca72f9f5fc58f69a337cda28f0e226096cd0b864af6c"} Apr 21 16:13:58.252036 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:13:58.252007 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce6b668f-9f8e-47bf-9e33-10657870a4a1" path="/var/lib/kubelet/pods/ce6b668f-9f8e-47bf-9e33-10657870a4a1/volumes" Apr 21 16:14:02.154094 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.154061 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2"] Apr 21 16:14:02.174262 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.174236 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2"] Apr 21 16:14:02.174408 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.174354 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.177053 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.177026 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 21 16:14:02.333995 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.333962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.334148 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.333998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.334148 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.334075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.334148 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.334113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74be8ffa-3171-4ff2-883d-28515c021095-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.334148 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.334145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8snd\" (UniqueName: \"kubernetes.io/projected/74be8ffa-3171-4ff2-883d-28515c021095-kube-api-access-q8snd\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.334349 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.334211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.434767 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.434692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-model-cache\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.434767 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.434732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74be8ffa-3171-4ff2-883d-28515c021095-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.434767 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.434752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8snd\" (UniqueName: \"kubernetes.io/projected/74be8ffa-3171-4ff2-883d-28515c021095-kube-api-access-q8snd\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.435105 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.435017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.435173 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.435138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.435240 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.435198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.435290 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.435236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.435502 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.435480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.435634 ip-10-0-138-191 kubenswrapper[2576]: I0421 
16:14:02.435606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.437282 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.437259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74be8ffa-3171-4ff2-883d-28515c021095-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.437522 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.437501 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/74be8ffa-3171-4ff2-883d-28515c021095-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.442329 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.442307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8snd\" (UniqueName: \"kubernetes.io/projected/74be8ffa-3171-4ff2-883d-28515c021095-kube-api-access-q8snd\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2\" (UID: \"74be8ffa-3171-4ff2-883d-28515c021095\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.485842 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.485820 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:02.628777 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:02.628747 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2"] Apr 21 16:14:02.629975 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:14:02.629938 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74be8ffa_3171_4ff2_883d_28515c021095.slice/crio-3617d0334a1d4f52cbfe744bb0dce8d313e41ebec0d6a209eb564b7c1c982636 WatchSource:0}: Error finding container 3617d0334a1d4f52cbfe744bb0dce8d313e41ebec0d6a209eb564b7c1c982636: Status 404 returned error can't find the container with id 3617d0334a1d4f52cbfe744bb0dce8d313e41ebec0d6a209eb564b7c1c982636 Apr 21 16:14:03.093020 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:03.092933 2576 generic.go:358] "Generic (PLEG): container finished" podID="9be06ddc-fd85-47f2-ad11-52c616571953" containerID="b74534c85a842732f8a5ddda83241ca42bb339149ae29915820165cf0064904a" exitCode=0 Apr 21 16:14:03.093197 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:03.093017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" event={"ID":"9be06ddc-fd85-47f2-ad11-52c616571953","Type":"ContainerDied","Data":"b74534c85a842732f8a5ddda83241ca42bb339149ae29915820165cf0064904a"} Apr 21 16:14:03.094645 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:03.094623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" event={"ID":"74be8ffa-3171-4ff2-883d-28515c021095","Type":"ContainerStarted","Data":"25820302298d8adb9b5741fba343620da393626aa04d067aba15c84f3d8aa307"} Apr 21 16:14:03.094743 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:03.094653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" event={"ID":"74be8ffa-3171-4ff2-883d-28515c021095","Type":"ContainerStarted","Data":"3617d0334a1d4f52cbfe744bb0dce8d313e41ebec0d6a209eb564b7c1c982636"} Apr 21 16:14:04.100679 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:04.100588 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" event={"ID":"9be06ddc-fd85-47f2-ad11-52c616571953","Type":"ContainerStarted","Data":"32ccc46d8050bf32bf51951a1cbe2f40b5f53bafdc808443c2fc971501c069c9"} Apr 21 16:14:04.121801 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:04.121756 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" podStartSLOduration=7.516343561 podStartE2EDuration="8.121740497s" podCreationTimestamp="2026-04-21 16:13:56 +0000 UTC" firstStartedPulling="2026-04-21 16:14:03.093791435 +0000 UTC m=+731.469973223" lastFinishedPulling="2026-04-21 16:14:03.699188356 +0000 UTC m=+732.075370159" observedRunningTime="2026-04-21 16:14:04.120447687 +0000 UTC m=+732.496629497" watchObservedRunningTime="2026-04-21 16:14:04.121740497 +0000 UTC m=+732.497922309" Apr 21 16:14:06.080368 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:06.080332 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s" Apr 21 16:14:09.122717 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:09.122681 2576 generic.go:358] 
"Generic (PLEG): container finished" podID="74be8ffa-3171-4ff2-883d-28515c021095" containerID="25820302298d8adb9b5741fba343620da393626aa04d067aba15c84f3d8aa307" exitCode=0 Apr 21 16:14:09.123161 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:09.122750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" event={"ID":"74be8ffa-3171-4ff2-883d-28515c021095","Type":"ContainerDied","Data":"25820302298d8adb9b5741fba343620da393626aa04d067aba15c84f3d8aa307"} Apr 21 16:14:10.129719 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:10.129630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" event={"ID":"74be8ffa-3171-4ff2-883d-28515c021095","Type":"ContainerStarted","Data":"0ccfa0546c8afca7fb74d07e0cac13d73df957f695ec3813cd413fd571c077c3"} Apr 21 16:14:10.130210 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:10.129919 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:14:10.154646 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:10.154577 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" podStartSLOduration=7.629078654 podStartE2EDuration="8.154561446s" podCreationTimestamp="2026-04-21 16:14:02 +0000 UTC" firstStartedPulling="2026-04-21 16:14:09.123328661 +0000 UTC m=+737.499510451" lastFinishedPulling="2026-04-21 16:14:09.648811454 +0000 UTC m=+738.024993243" observedRunningTime="2026-04-21 16:14:10.151879611 +0000 UTC m=+738.528061418" watchObservedRunningTime="2026-04-21 16:14:10.154561446 +0000 UTC m=+738.530743326" Apr 21 16:14:14.101871 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:14.101833 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:14:14.114641 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:14.114618 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2" Apr 21 16:14:21.147301 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:14:21.147268 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2" Apr 21 16:15:00.147939 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.147902 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29613135-bjzmf"] Apr 21 16:15:00.152668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.152649 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" Apr 21 16:15:00.155632 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.155615 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hhkxk\"" Apr 21 16:15:00.165828 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.165807 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613135-bjzmf"] Apr 21 16:15:00.314462 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.314426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkw7b\" (UniqueName: \"kubernetes.io/projected/fc660a0a-dff3-45f7-96ac-7c4ae7a51503-kube-api-access-tkw7b\") pod \"maas-api-key-cleanup-29613135-bjzmf\" (UID: \"fc660a0a-dff3-45f7-96ac-7c4ae7a51503\") " pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" Apr 21 16:15:00.415629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.415539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkw7b\" (UniqueName: \"kubernetes.io/projected/fc660a0a-dff3-45f7-96ac-7c4ae7a51503-kube-api-access-tkw7b\") pod \"maas-api-key-cleanup-29613135-bjzmf\" (UID: \"fc660a0a-dff3-45f7-96ac-7c4ae7a51503\") " pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" Apr 21 16:15:00.424527 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.424506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkw7b\" (UniqueName: \"kubernetes.io/projected/fc660a0a-dff3-45f7-96ac-7c4ae7a51503-kube-api-access-tkw7b\") pod \"maas-api-key-cleanup-29613135-bjzmf\" (UID: \"fc660a0a-dff3-45f7-96ac-7c4ae7a51503\") " pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" Apr 21 16:15:00.462998 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.462974 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" Apr 21 16:15:00.790960 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:00.790928 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613135-bjzmf"] Apr 21 16:15:00.791258 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:15:00.791229 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc660a0a_dff3_45f7_96ac_7c4ae7a51503.slice/crio-e01519acedfcdba92f5b5e4c9cafc834d28d43f5c826db1f8403f5de80892637 WatchSource:0}: Error finding container e01519acedfcdba92f5b5e4c9cafc834d28d43f5c826db1f8403f5de80892637: Status 404 returned error can't find the container with id e01519acedfcdba92f5b5e4c9cafc834d28d43f5c826db1f8403f5de80892637 Apr 21 16:15:01.349584 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:01.349551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" event={"ID":"fc660a0a-dff3-45f7-96ac-7c4ae7a51503","Type":"ContainerStarted","Data":"e01519acedfcdba92f5b5e4c9cafc834d28d43f5c826db1f8403f5de80892637"} Apr 21 16:15:02.353896 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:02.353842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" event={"ID":"fc660a0a-dff3-45f7-96ac-7c4ae7a51503","Type":"ContainerStarted","Data":"e464e35da7213efeac6fb32c191e4bdf56e6b4e42fd7c9699e452cc167683cb1"} Apr 21 16:15:02.371513 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:02.371459 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" podStartSLOduration=1.2942253209999999 podStartE2EDuration="2.371445226s" podCreationTimestamp="2026-04-21 16:15:00 +0000 UTC" firstStartedPulling="2026-04-21 16:15:00.793098172 +0000 UTC m=+789.169279961" lastFinishedPulling="2026-04-21 16:15:01.870318072 +0000 UTC m=+790.246499866" observedRunningTime="2026-04-21 16:15:02.369173089 +0000 UTC m=+790.745354901" watchObservedRunningTime="2026-04-21 16:15:02.371445226 +0000 UTC m=+790.747627037" Apr 21 16:15:17.760258 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.760188 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5c85588b66-mgs2k"] Apr 21 16:15:17.763772 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.763752 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5c85588b66-mgs2k" Apr 21 16:15:17.770951 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.770773 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5c85588b66-mgs2k"] Apr 21 16:15:17.855685 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.855663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a8795a42-4648-43dc-aec9-984ac745e22c-tls-cert\") pod \"authorino-5c85588b66-mgs2k\" (UID: \"a8795a42-4648-43dc-aec9-984ac745e22c\") " pod="kuadrant-system/authorino-5c85588b66-mgs2k" Apr 21 16:15:17.855802 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.855695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmts\" (UniqueName: \"kubernetes.io/projected/a8795a42-4648-43dc-aec9-984ac745e22c-kube-api-access-vhmts\") pod \"authorino-5c85588b66-mgs2k\" (UID: \"a8795a42-4648-43dc-aec9-984ac745e22c\") " pod="kuadrant-system/authorino-5c85588b66-mgs2k" Apr 21 16:15:17.956487 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.956455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a8795a42-4648-43dc-aec9-984ac745e22c-tls-cert\") pod \"authorino-5c85588b66-mgs2k\" (UID: \"a8795a42-4648-43dc-aec9-984ac745e22c\") " pod="kuadrant-system/authorino-5c85588b66-mgs2k" Apr 21 16:15:17.956596 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.956502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmts\" (UniqueName: \"kubernetes.io/projected/a8795a42-4648-43dc-aec9-984ac745e22c-kube-api-access-vhmts\") pod \"authorino-5c85588b66-mgs2k\" (UID: \"a8795a42-4648-43dc-aec9-984ac745e22c\") " pod="kuadrant-system/authorino-5c85588b66-mgs2k" Apr 21 16:15:17.958744 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.958724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a8795a42-4648-43dc-aec9-984ac745e22c-tls-cert\") pod \"authorino-5c85588b66-mgs2k\" (UID: \"a8795a42-4648-43dc-aec9-984ac745e22c\") " pod="kuadrant-system/authorino-5c85588b66-mgs2k" Apr 21 16:15:17.963484 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:17.963464 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmts\" (UniqueName: \"kubernetes.io/projected/a8795a42-4648-43dc-aec9-984ac745e22c-kube-api-access-vhmts\") pod \"authorino-5c85588b66-mgs2k\" (UID: \"a8795a42-4648-43dc-aec9-984ac745e22c\") " pod="kuadrant-system/authorino-5c85588b66-mgs2k" Apr 21 16:15:18.073716 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:18.073657 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5c85588b66-mgs2k" Apr 21 16:15:18.201396 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:18.201365 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5c85588b66-mgs2k"] Apr 21 16:15:18.203075 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:15:18.203048 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8795a42_4648_43dc_aec9_984ac745e22c.slice/crio-6eef04d923c7caf80a700fb6d68cc812092db61b30c6d98fac1850e7ac8e7477 WatchSource:0}: Error finding container 6eef04d923c7caf80a700fb6d68cc812092db61b30c6d98fac1850e7ac8e7477: Status 404 returned error can't find the container with id 6eef04d923c7caf80a700fb6d68cc812092db61b30c6d98fac1850e7ac8e7477 Apr 21 16:15:18.418651 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:18.418619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5c85588b66-mgs2k" event={"ID":"a8795a42-4648-43dc-aec9-984ac745e22c","Type":"ContainerStarted","Data":"6eef04d923c7caf80a700fb6d68cc812092db61b30c6d98fac1850e7ac8e7477"} Apr 21 16:15:19.424185 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.424148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5c85588b66-mgs2k" event={"ID":"a8795a42-4648-43dc-aec9-984ac745e22c","Type":"ContainerStarted","Data":"9f4eb032bda6b937482e105d1acf0182dcb2af34a41cc6287b5581b3c6ca7aad"} Apr 21 16:15:19.443555 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.443510 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5c85588b66-mgs2k" podStartSLOduration=1.998144322 podStartE2EDuration="2.443500127s" podCreationTimestamp="2026-04-21 16:15:17 +0000 UTC" firstStartedPulling="2026-04-21 16:15:18.204354563 +0000 UTC m=+806.580536352" lastFinishedPulling="2026-04-21 16:15:18.649710367 +0000 UTC m=+807.025892157" observedRunningTime="2026-04-21 16:15:19.441637207 +0000 UTC m=+807.817819018" watchObservedRunningTime="2026-04-21 16:15:19.443500127 +0000 UTC m=+807.819681960" Apr 21 16:15:19.472437 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.472411 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57c6774d59-6tk24"] Apr 21 16:15:19.472612 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.472593 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-57c6774d59-6tk24" podUID="1708d9cf-8896-4a8c-be00-7d465c7eef29" containerName="authorino" containerID="cri-o://cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac" gracePeriod=30 Apr 21 16:15:19.716359 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.716340 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:15:19.774154 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.774125 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c5hk\" (UniqueName: \"kubernetes.io/projected/1708d9cf-8896-4a8c-be00-7d465c7eef29-kube-api-access-7c5hk\") pod \"1708d9cf-8896-4a8c-be00-7d465c7eef29\" (UID: \"1708d9cf-8896-4a8c-be00-7d465c7eef29\") " Apr 21 16:15:19.774280 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.774194 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1708d9cf-8896-4a8c-be00-7d465c7eef29-tls-cert\") pod \"1708d9cf-8896-4a8c-be00-7d465c7eef29\" (UID: \"1708d9cf-8896-4a8c-be00-7d465c7eef29\") " Apr 21 16:15:19.776270 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.776223 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1708d9cf-8896-4a8c-be00-7d465c7eef29-kube-api-access-7c5hk" (OuterVolumeSpecName: "kube-api-access-7c5hk") pod "1708d9cf-8896-4a8c-be00-7d465c7eef29" (UID: "1708d9cf-8896-4a8c-be00-7d465c7eef29"). InnerVolumeSpecName "kube-api-access-7c5hk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:15:19.785441 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.785420 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1708d9cf-8896-4a8c-be00-7d465c7eef29-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "1708d9cf-8896-4a8c-be00-7d465c7eef29" (UID: "1708d9cf-8896-4a8c-be00-7d465c7eef29"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:15:19.874919 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.874885 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7c5hk\" (UniqueName: \"kubernetes.io/projected/1708d9cf-8896-4a8c-be00-7d465c7eef29-kube-api-access-7c5hk\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:15:19.874919 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:19.874914 2576 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1708d9cf-8896-4a8c-be00-7d465c7eef29-tls-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:15:20.430294 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.430263 2576 generic.go:358] "Generic (PLEG): container finished" podID="1708d9cf-8896-4a8c-be00-7d465c7eef29" containerID="cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac" exitCode=0 Apr 21 16:15:20.430668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.430316 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-57c6774d59-6tk24" Apr 21 16:15:20.430668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.430337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57c6774d59-6tk24" event={"ID":"1708d9cf-8896-4a8c-be00-7d465c7eef29","Type":"ContainerDied","Data":"cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac"} Apr 21 16:15:20.430668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.430371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57c6774d59-6tk24" event={"ID":"1708d9cf-8896-4a8c-be00-7d465c7eef29","Type":"ContainerDied","Data":"70bda40205a4841dc67faba7450be2fa029314d0cd59277fa504c0a804050fb8"} Apr 21 16:15:20.430668 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.430390 2576 scope.go:117] "RemoveContainer" containerID="cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac" Apr 21 16:15:20.439326 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.439311 2576 scope.go:117] "RemoveContainer" containerID="cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac" Apr 21 16:15:20.439545 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:15:20.439530 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac\": container with ID starting with cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac not found: ID does not exist" containerID="cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac" Apr 21 16:15:20.439592 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.439552 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac"} err="failed to get container status \"cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac\": rpc error: code = NotFound desc = could not find container \"cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac\": container with ID starting with cb5f884bc018b41e29f78808bdb9d34fcba01abf4849895c4c7b3f176ed0c4ac not found: ID does not exist" Apr 21 16:15:20.448767 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.448747 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57c6774d59-6tk24"] Apr 21 16:15:20.458222 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:20.455633 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-57c6774d59-6tk24"] Apr 21 16:15:22.254526 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:22.254493 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1708d9cf-8896-4a8c-be00-7d465c7eef29" path="/var/lib/kubelet/pods/1708d9cf-8896-4a8c-be00-7d465c7eef29/volumes" Apr 21 16:15:23.447982 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:23.447948 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerID="e464e35da7213efeac6fb32c191e4bdf56e6b4e42fd7c9699e452cc167683cb1" exitCode=6 Apr 21 16:15:23.448526 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:23.448026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" event={"ID":"fc660a0a-dff3-45f7-96ac-7c4ae7a51503","Type":"ContainerDied","Data":"e464e35da7213efeac6fb32c191e4bdf56e6b4e42fd7c9699e452cc167683cb1"} Apr 21 16:15:23.448526 ip-10-0-138-191 kubenswrapper[2576]: 
I0421 16:15:23.448314 2576 scope.go:117] "RemoveContainer" containerID="e464e35da7213efeac6fb32c191e4bdf56e6b4e42fd7c9699e452cc167683cb1" Apr 21 16:15:24.458672 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:24.458592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" event={"ID":"fc660a0a-dff3-45f7-96ac-7c4ae7a51503","Type":"ContainerStarted","Data":"270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b"} Apr 21 16:15:44.538184 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:44.538151 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerID="270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b" exitCode=6 Apr 21 16:15:44.538604 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:44.538221 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" event={"ID":"fc660a0a-dff3-45f7-96ac-7c4ae7a51503","Type":"ContainerDied","Data":"270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b"} Apr 21 16:15:44.538604 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:44.538264 2576 scope.go:117] "RemoveContainer" containerID="e464e35da7213efeac6fb32c191e4bdf56e6b4e42fd7c9699e452cc167683cb1" Apr 21 16:15:44.538604 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:44.538551 2576 scope.go:117] "RemoveContainer" containerID="270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b" Apr 21 16:15:44.538798 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:15:44.538777 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29613135-bjzmf_opendatahub(fc660a0a-dff3-45f7-96ac-7c4ae7a51503)\"" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" Apr 21 16:15:55.247611 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:55.247583 2576 scope.go:117] "RemoveContainer" containerID="270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b" Apr 21 16:15:55.585821 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:55.585791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" event={"ID":"fc660a0a-dff3-45f7-96ac-7c4ae7a51503","Type":"ContainerStarted","Data":"47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246"} Apr 21 16:15:56.276578 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:56.276541 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613135-bjzmf"] Apr 21 16:15:56.591825 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:15:56.591643 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" containerID="cri-o://47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246" gracePeriod=30 Apr 21 16:16:16.128292 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.128270 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" Apr 21 16:16:16.158176 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.157518 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkw7b\" (UniqueName: \"kubernetes.io/projected/fc660a0a-dff3-45f7-96ac-7c4ae7a51503-kube-api-access-tkw7b\") pod \"fc660a0a-dff3-45f7-96ac-7c4ae7a51503\" (UID: \"fc660a0a-dff3-45f7-96ac-7c4ae7a51503\") " Apr 21 16:16:16.160621 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.160537 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc660a0a-dff3-45f7-96ac-7c4ae7a51503-kube-api-access-tkw7b" (OuterVolumeSpecName: "kube-api-access-tkw7b") pod "fc660a0a-dff3-45f7-96ac-7c4ae7a51503" (UID: "fc660a0a-dff3-45f7-96ac-7c4ae7a51503"). InnerVolumeSpecName "kube-api-access-tkw7b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:16:16.259010 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.258986 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkw7b\" (UniqueName: \"kubernetes.io/projected/fc660a0a-dff3-45f7-96ac-7c4ae7a51503-kube-api-access-tkw7b\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:16:16.674363 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.674326 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerID="47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246" exitCode=6 Apr 21 16:16:16.674535 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.674392 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" Apr 21 16:16:16.674535 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.674406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" event={"ID":"fc660a0a-dff3-45f7-96ac-7c4ae7a51503","Type":"ContainerDied","Data":"47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246"} Apr 21 16:16:16.674535 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.674443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613135-bjzmf" event={"ID":"fc660a0a-dff3-45f7-96ac-7c4ae7a51503","Type":"ContainerDied","Data":"e01519acedfcdba92f5b5e4c9cafc834d28d43f5c826db1f8403f5de80892637"} Apr 21 16:16:16.674535 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.674459 2576 scope.go:117] "RemoveContainer" containerID="47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246" Apr 21 16:16:16.684025 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.683995 2576 scope.go:117] "RemoveContainer" containerID="270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b" Apr 21 16:16:16.692398 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.692370 2576 scope.go:117] "RemoveContainer" containerID="47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246" Apr 21 16:16:16.692646 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:16:16.692624 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246\": container with ID starting with 47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246 not found: ID does not exist" containerID="47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246" Apr 21 16:16:16.692736 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.692657 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246"} err="failed to get container status \"47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246\": rpc error: code = NotFound desc = could not find container \"47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246\": container with ID starting with 47fa8ea35bca720ca490420addf17345c4c2176f802f9c61f77221f8bd500246 not found: ID does not exist" Apr 21 16:16:16.692736 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.692684 2576 scope.go:117] "RemoveContainer" containerID="270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b" Apr 21 16:16:16.693035 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:16:16.693018 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b\": container with ID starting with 270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b not found: ID does not exist" containerID="270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b" Apr 21 16:16:16.693104 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.693043 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b"} err="failed to get container status \"270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b\": rpc error: code = NotFound desc = could not find container \"270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b\": container with ID starting with 270ed8f549d1b9a3116edbc1d943c74d1dd34bb7cd0cd220876d9436174c401b not found: ID does not exist" Apr 21 16:16:16.695349 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.695330 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613135-bjzmf"] Apr 21 16:16:16.699749 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:16.699728 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613135-bjzmf"] Apr 21 16:16:18.253231 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:18.253198 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" path="/var/lib/kubelet/pods/fc660a0a-dff3-45f7-96ac-7c4ae7a51503/volumes" Apr 21 16:16:42.819263 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819189 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-548ffc956-qbx9t"] Apr 21 16:16:42.819762 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819740 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:16:42.819879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819766 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:16:42.819879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819787 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:16:42.819879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819795 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:16:42.819879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819810 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:16:42.819879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819819 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:16:42.820053 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819885 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1708d9cf-8896-4a8c-be00-7d465c7eef29" containerName="authorino" Apr 21 16:16:42.820053 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.819895 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1708d9cf-8896-4a8c-be00-7d465c7eef29" containerName="authorino" Apr 21 16:16:42.820053 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.820016 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:16:42.820053 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.820033 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:16:42.820053 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.820042 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1708d9cf-8896-4a8c-be00-7d465c7eef29" containerName="authorino" Apr 21 16:16:42.824974 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.824957 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-548ffc956-qbx9t" Apr 21 16:16:42.827722 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.827688 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hr5hm\"" Apr 21 16:16:42.830765 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.830740 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-548ffc956-qbx9t"] Apr 21 16:16:42.879725 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.879704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwzh4\" (UniqueName: \"kubernetes.io/projected/effcc86e-c5d4-4c78-8b70-066f0aefb563-kube-api-access-wwzh4\") pod \"maas-controller-548ffc956-qbx9t\" (UID: \"effcc86e-c5d4-4c78-8b70-066f0aefb563\") " pod="opendatahub/maas-controller-548ffc956-qbx9t" Apr 21 16:16:42.981052 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.981028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwzh4\" (UniqueName: \"kubernetes.io/projected/effcc86e-c5d4-4c78-8b70-066f0aefb563-kube-api-access-wwzh4\") pod \"maas-controller-548ffc956-qbx9t\" (UID: \"effcc86e-c5d4-4c78-8b70-066f0aefb563\") " pod="opendatahub/maas-controller-548ffc956-qbx9t" Apr 21 16:16:42.989191 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:42.989167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwzh4\" (UniqueName: \"kubernetes.io/projected/effcc86e-c5d4-4c78-8b70-066f0aefb563-kube-api-access-wwzh4\") pod \"maas-controller-548ffc956-qbx9t\" (UID: \"effcc86e-c5d4-4c78-8b70-066f0aefb563\") " pod="opendatahub/maas-controller-548ffc956-qbx9t" Apr 21 16:16:43.161478 ip-10-0-138-191 kubenswrapper[2576]: 
I0421 16:16:43.161446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-548ffc956-qbx9t" Apr 21 16:16:43.491320 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:43.491294 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-548ffc956-qbx9t"] Apr 21 16:16:43.493312 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:16:43.493287 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeffcc86e_c5d4_4c78_8b70_066f0aefb563.slice/crio-59a85e86c5d8369ece8cb369d5f51a6fdd32f782338a907be0e8b29b4f0e542d WatchSource:0}: Error finding container 59a85e86c5d8369ece8cb369d5f51a6fdd32f782338a907be0e8b29b4f0e542d: Status 404 returned error can't find the container with id 59a85e86c5d8369ece8cb369d5f51a6fdd32f782338a907be0e8b29b4f0e542d Apr 21 16:16:43.795358 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:43.795274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-548ffc956-qbx9t" event={"ID":"effcc86e-c5d4-4c78-8b70-066f0aefb563","Type":"ContainerStarted","Data":"59a85e86c5d8369ece8cb369d5f51a6fdd32f782338a907be0e8b29b4f0e542d"} Apr 21 16:16:44.800790 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:44.800753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-548ffc956-qbx9t" event={"ID":"effcc86e-c5d4-4c78-8b70-066f0aefb563","Type":"ContainerStarted","Data":"35543dbf5594c9e1e2ce71e221ff44b6c68a2ccaf9ca13d4f75b6d1fcafa2dde"} Apr 21 16:16:44.801312 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:44.800804 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-548ffc956-qbx9t" Apr 21 16:16:44.818878 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:44.816984 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-548ffc956-qbx9t" podStartSLOduration=2.296606686 podStartE2EDuration="2.816960594s" podCreationTimestamp="2026-04-21 16:16:42 +0000 UTC" firstStartedPulling="2026-04-21 16:16:43.495051956 +0000 UTC m=+891.871233746" lastFinishedPulling="2026-04-21 16:16:44.015405866 +0000 UTC m=+892.391587654" observedRunningTime="2026-04-21 16:16:44.816141038 +0000 UTC m=+893.192322850" watchObservedRunningTime="2026-04-21 16:16:44.816960594 +0000 UTC m=+893.193142406" Apr 21 16:16:52.203709 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:52.203676 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:16:52.205514 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:52.205493 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:16:55.810405 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:16:55.810375 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-548ffc956-qbx9t" Apr 21 16:21:52.243181 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:21:52.243151 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:21:52.249082 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:21:52.249061 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:26:52.282461 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:26:52.282431 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:26:52.288633 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:26:52.288610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:30:00.151336 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.151298 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29613150-9dzpl"] Apr 21 16:30:00.151816 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.151767 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc660a0a-dff3-45f7-96ac-7c4ae7a51503" containerName="cleanup" Apr 21 16:30:00.154891 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.154851 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" Apr 21 16:30:00.157580 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.157557 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hhkxk\"" Apr 21 16:30:00.180266 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.180236 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613150-9dzpl"] Apr 21 16:30:00.278234 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.278198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw5df\" (UniqueName: \"kubernetes.io/projected/2641edfb-a299-46c3-bb5d-3899959b7cd7-kube-api-access-cw5df\") pod \"maas-api-key-cleanup-29613150-9dzpl\" (UID: \"2641edfb-a299-46c3-bb5d-3899959b7cd7\") " pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" Apr 21 16:30:00.379337 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.379303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw5df\" (UniqueName: \"kubernetes.io/projected/2641edfb-a299-46c3-bb5d-3899959b7cd7-kube-api-access-cw5df\") pod \"maas-api-key-cleanup-29613150-9dzpl\" (UID: \"2641edfb-a299-46c3-bb5d-3899959b7cd7\") " pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" Apr 21 16:30:00.388739 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.388710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw5df\" (UniqueName: \"kubernetes.io/projected/2641edfb-a299-46c3-bb5d-3899959b7cd7-kube-api-access-cw5df\") pod \"maas-api-key-cleanup-29613150-9dzpl\" (UID: \"2641edfb-a299-46c3-bb5d-3899959b7cd7\") " pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" Apr 21 16:30:00.465332 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.465236 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" Apr 21 16:30:00.592652 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.592627 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613150-9dzpl"] Apr 21 16:30:00.594746 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:30:00.594719 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2641edfb_a299_46c3_bb5d_3899959b7cd7.slice/crio-515e4fa69298c487b50e26aa522405e93dcd7d505f60bd8d52cc938156b4d8f4 WatchSource:0}: Error finding container 515e4fa69298c487b50e26aa522405e93dcd7d505f60bd8d52cc938156b4d8f4: Status 404 returned error can't find the container with id 515e4fa69298c487b50e26aa522405e93dcd7d505f60bd8d52cc938156b4d8f4 Apr 21 16:30:00.596929 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:00.596910 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:30:01.045379 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:01.045344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" event={"ID":"2641edfb-a299-46c3-bb5d-3899959b7cd7","Type":"ContainerStarted","Data":"60f7a3a42545014cb01440688fb401cafdd98cd6bbc74b9c088928cd38f18a0c"} Apr 21 16:30:01.045379 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:01.045381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" event={"ID":"2641edfb-a299-46c3-bb5d-3899959b7cd7","Type":"ContainerStarted","Data":"515e4fa69298c487b50e26aa522405e93dcd7d505f60bd8d52cc938156b4d8f4"} Apr 21 16:30:22.139840 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:22.139799 2576 generic.go:358] "Generic (PLEG): container finished" podID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerID="60f7a3a42545014cb01440688fb401cafdd98cd6bbc74b9c088928cd38f18a0c" exitCode=6 Apr 21 16:30:22.140313 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:22.139884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" event={"ID":"2641edfb-a299-46c3-bb5d-3899959b7cd7","Type":"ContainerDied","Data":"60f7a3a42545014cb01440688fb401cafdd98cd6bbc74b9c088928cd38f18a0c"} Apr 21 16:30:22.140313 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:22.140275 2576 scope.go:117] "RemoveContainer" containerID="60f7a3a42545014cb01440688fb401cafdd98cd6bbc74b9c088928cd38f18a0c" Apr 21 16:30:23.147284 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:23.147243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" event={"ID":"2641edfb-a299-46c3-bb5d-3899959b7cd7","Type":"ContainerStarted","Data":"79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e"} Apr 21 16:30:23.171474 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:23.171412 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" podStartSLOduration=23.171396661 podStartE2EDuration="23.171396661s" podCreationTimestamp="2026-04-21 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:30:01.079419744 +0000 UTC m=+1689.455601555" watchObservedRunningTime="2026-04-21 16:30:23.171396661 +0000 UTC m=+1711.547578472" Apr 21 16:30:43.233331 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:43.233296 2576 
generic.go:358] "Generic (PLEG): container finished" podID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerID="79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e" exitCode=6 Apr 21 16:30:43.233796 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:43.233375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" event={"ID":"2641edfb-a299-46c3-bb5d-3899959b7cd7","Type":"ContainerDied","Data":"79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e"} Apr 21 16:30:43.233796 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:43.233435 2576 scope.go:117] "RemoveContainer" containerID="60f7a3a42545014cb01440688fb401cafdd98cd6bbc74b9c088928cd38f18a0c" Apr 21 16:30:43.233946 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:43.233830 2576 scope.go:117] "RemoveContainer" containerID="79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e" Apr 21 16:30:43.234115 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:30:43.234095 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29613150-9dzpl_opendatahub(2641edfb-a299-46c3-bb5d-3899959b7cd7)\"" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" Apr 21 16:30:57.248327 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:57.248287 2576 scope.go:117] "RemoveContainer" containerID="79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e" Apr 21 16:30:58.298557 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:58.298523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" event={"ID":"2641edfb-a299-46c3-bb5d-3899959b7cd7","Type":"ContainerStarted","Data":"0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb"} Apr 21 16:30:59.346670 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:59.346635 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613150-9dzpl"] Apr 21 16:30:59.347117 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:30:59.346850 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerName="cleanup" containerID="cri-o://0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb" gracePeriod=30 Apr 21 16:31:17.990276 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:17.990251 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" Apr 21 16:31:18.099798 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.099704 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw5df\" (UniqueName: \"kubernetes.io/projected/2641edfb-a299-46c3-bb5d-3899959b7cd7-kube-api-access-cw5df\") pod \"2641edfb-a299-46c3-bb5d-3899959b7cd7\" (UID: \"2641edfb-a299-46c3-bb5d-3899959b7cd7\") " Apr 21 16:31:18.101627 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.101605 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2641edfb-a299-46c3-bb5d-3899959b7cd7-kube-api-access-cw5df" (OuterVolumeSpecName: "kube-api-access-cw5df") pod "2641edfb-a299-46c3-bb5d-3899959b7cd7" (UID: "2641edfb-a299-46c3-bb5d-3899959b7cd7"). InnerVolumeSpecName "kube-api-access-cw5df". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:31:18.200306 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.200284 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cw5df\" (UniqueName: \"kubernetes.io/projected/2641edfb-a299-46c3-bb5d-3899959b7cd7-kube-api-access-cw5df\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 21 16:31:18.378356 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.378280 2576 generic.go:358] "Generic (PLEG): container finished" podID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerID="0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb" exitCode=6 Apr 21 16:31:18.378356 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.378344 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" Apr 21 16:31:18.378521 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.378362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" event={"ID":"2641edfb-a299-46c3-bb5d-3899959b7cd7","Type":"ContainerDied","Data":"0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb"} Apr 21 16:31:18.378521 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.378408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613150-9dzpl" event={"ID":"2641edfb-a299-46c3-bb5d-3899959b7cd7","Type":"ContainerDied","Data":"515e4fa69298c487b50e26aa522405e93dcd7d505f60bd8d52cc938156b4d8f4"} Apr 21 16:31:18.378521 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.378424 2576 scope.go:117] "RemoveContainer" containerID="0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb" Apr 21 16:31:18.389584 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.389561 2576 scope.go:117] "RemoveContainer" containerID="79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e" Apr 21 16:31:18.397664 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.397637 2576 scope.go:117] "RemoveContainer" containerID="0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb" Apr 21 16:31:18.397995 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:31:18.397973 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb\": container with ID starting with 0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb not found: ID does not exist" containerID="0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb" Apr 21 16:31:18.398085 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.398008 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb"} err="failed to get container status \"0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb\": rpc error: code = NotFound desc = could not find container \"0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb\": container with ID starting with 0da339ea7615f898a58e6da8fddf269aabe744d25715799d9b19307bba5d6ceb not found: ID does not exist" Apr 21 16:31:18.398085 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.398031 2576 scope.go:117] "RemoveContainer" containerID="79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e" Apr 21 16:31:18.398295 ip-10-0-138-191 kubenswrapper[2576]: E0421 16:31:18.398278 2576 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e\": container with ID starting with 79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e not found: ID does not exist" containerID="79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e" Apr 21 16:31:18.398337 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.398300 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e"} err="failed to get container status \"79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e\": rpc error: code = NotFound desc = could not find container \"79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e\": container with ID starting with 79bda3b449067ebcae373137842e9ff641d1cae5ceee8ef6cc49acc01185bd9e not found: ID does not exist" Apr 21 16:31:18.399699 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.399678 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613150-9dzpl"] Apr 21 16:31:18.403670 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:18.403651 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613150-9dzpl"] Apr 21 16:31:20.253026 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:20.252994 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" path="/var/lib/kubelet/pods/2641edfb-a299-46c3-bb5d-3899959b7cd7/volumes" Apr 21 16:31:52.317197 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:52.317101 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:31:52.324603 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:31:52.324586 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:36:52.353141 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:36:52.353046 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:36:52.365031 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:36:52.365012 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7dm6j_eb33d6a3-42f4-4598-8722-83e1ccb59eae/ovn-acl-logging/0.log" Apr 21 16:37:38.749227 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:38.749195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5c85588b66-mgs2k_a8795a42-4648-43dc-aec9-984ac745e22c/authorino/0.log" Apr 21 16:37:42.710090 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:42.710015 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-548ffc956-qbx9t_effcc86e-c5d4-4c78-8b70-066f0aefb563/manager/0.log" Apr 21 16:37:43.164336 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:43.164311 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-774f54dc87-vqs87_8b028677-6d59-4d75-9eb8-5758225d6d88/manager/0.log" Apr 21 16:37:43.268673 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:43.268644 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_postgres-868db5846d-xjfcw_fda97a47-c138-4a75-b145-85d16968d1eb/postgres/0.log" Apr 21 16:37:44.030813 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.030786 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p_cb431080-b815-441e-ae40-a7cbb06c00e8/util/0.log" Apr 21 16:37:44.040613 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.040593 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p_cb431080-b815-441e-ae40-a7cbb06c00e8/pull/0.log" Apr 21 16:37:44.049354 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.049335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p_cb431080-b815-441e-ae40-a7cbb06c00e8/extract/0.log" Apr 21 16:37:44.156124 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.156102 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9_514becdf-1310-4620-9c42-c349a3b1359a/util/0.log" Apr 21 16:37:44.164214 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.164198 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9_514becdf-1310-4620-9c42-c349a3b1359a/pull/0.log" Apr 21 16:37:44.170386 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.170370 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9_514becdf-1310-4620-9c42-c349a3b1359a/extract/0.log" Apr 21 16:37:44.275599 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.275582 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9_3bd6de24-3b32-4014-828d-44abd752467b/extract/0.log" Apr 21 16:37:44.282892 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.282830 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9_3bd6de24-3b32-4014-828d-44abd752467b/util/0.log" Apr 21 16:37:44.291564 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.291549 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9_3bd6de24-3b32-4014-828d-44abd752467b/pull/0.log" Apr 21 16:37:44.401322 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.401298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5_b187f500-8a83-41c6-8672-8216c305049d/util/0.log" Apr 21 16:37:44.409043 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.409025 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5_b187f500-8a83-41c6-8672-8216c305049d/pull/0.log" Apr 21 16:37:44.418508 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.418491 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5_b187f500-8a83-41c6-8672-8216c305049d/extract/0.log" Apr 21 16:37:44.540777 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.540725 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-5c85588b66-mgs2k_a8795a42-4648-43dc-aec9-984ac745e22c/authorino/0.log" Apr 21 16:37:44.775761 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.775741 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-krh7c_76de666e-091c-4605-a1b1-836dd44b381b/manager/0.log" Apr 21 16:37:44.995561 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:44.995535 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-jlgz6_008018a6-2152-4abe-b3dc-0fca25d4db0d/registry-server/0.log" Apr 21 16:37:45.343879 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:45.343795 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-ldfw7_7c5e7d02-3e1c-4afb-921c-06c00262a694/manager/0.log" Apr 21 16:37:45.676439 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:45.676415 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft_50c064db-1acb-40aa-87af-19f89d6d568c/istio-proxy/0.log" Apr 21 16:37:46.143258 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:46.143228 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-j9mdp_3a8f8c79-05b9-4e1f-a208-dca73b68e53b/istio-proxy/0.log" Apr 21 16:37:46.263184 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:46.263160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5d9c9b646-wbgwm_60362f96-0ab5-45f5-b480-008e39e6ac85/router/0.log" Apr 21 16:37:46.586052 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:46.585972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2_9be06ddc-fd85-47f2-ad11-52c616571953/storage-initializer/0.log" Apr 21 16:37:46.592814 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:46.592797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-cphg2_9be06ddc-fd85-47f2-ad11-52c616571953/main/0.log" Apr 21 16:37:46.933767 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:46.933740 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2_74be8ffa-3171-4ff2-883d-28515c021095/storage-initializer/0.log" Apr 21 16:37:46.940280 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:46.940259 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdtpb2_74be8ffa-3171-4ff2-883d-28515c021095/main/0.log" Apr 21 16:37:47.157704 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:47.157673 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s_d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee/main/0.log" Apr 21 16:37:47.164658 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:47.164633 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-lhp9s_d7d24cbb-4859-4e5c-b04e-64ff7d22b3ee/storage-initializer/0.log" Apr 21 16:37:53.875448 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:53.875419 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fh7mj_ac5d67c0-6512-49e3-99ae-4ea362c0dbe3/global-pull-secret-syncer/0.log" 
Apr 21 16:37:54.059912 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:54.059840 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6gg6g_81b42d6a-976a-40b8-887e-6faf9b28d9b4/konnectivity-agent/0.log" Apr 21 16:37:54.204123 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:54.204096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-191.ec2.internal_09a45ec1566c454073ee33f001f99f61/haproxy/0.log" Apr 21 16:37:58.082895 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.082871 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p_cb431080-b815-441e-ae40-a7cbb06c00e8/extract/0.log" Apr 21 16:37:58.107738 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.107711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p_cb431080-b815-441e-ae40-a7cbb06c00e8/util/0.log" Apr 21 16:37:58.129783 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.129760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75946r7p_cb431080-b815-441e-ae40-a7cbb06c00e8/pull/0.log" Apr 21 16:37:58.160173 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.160155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9_514becdf-1310-4620-9c42-c349a3b1359a/extract/0.log" Apr 21 16:37:58.182320 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.182297 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9_514becdf-1310-4620-9c42-c349a3b1359a/util/0.log" Apr 21 16:37:58.206219 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.206192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e064wc9_514becdf-1310-4620-9c42-c349a3b1359a/pull/0.log" Apr 21 16:37:58.231829 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.231813 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9_3bd6de24-3b32-4014-828d-44abd752467b/extract/0.log" Apr 21 16:37:58.253403 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.253386 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9_3bd6de24-3b32-4014-828d-44abd752467b/util/0.log" Apr 21 16:37:58.274556 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.274535 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732w4h9_3bd6de24-3b32-4014-828d-44abd752467b/pull/0.log" Apr 21 16:37:58.311180 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.311154 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5_b187f500-8a83-41c6-8672-8216c305049d/extract/0.log" Apr 21 16:37:58.334011 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.333953 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5_b187f500-8a83-41c6-8672-8216c305049d/util/0.log" Apr 21 
16:37:58.358602 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.358580 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17z7j5_b187f500-8a83-41c6-8672-8216c305049d/pull/0.log" Apr 21 16:37:58.538984 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.538952 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5c85588b66-mgs2k_a8795a42-4648-43dc-aec9-984ac745e22c/authorino/0.log" Apr 21 16:37:58.606939 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.606869 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-krh7c_76de666e-091c-4605-a1b1-836dd44b381b/manager/0.log" Apr 21 16:37:58.684945 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.684921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-jlgz6_008018a6-2152-4abe-b3dc-0fca25d4db0d/registry-server/0.log" Apr 21 16:37:58.915394 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:37:58.915367 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-ldfw7_7c5e7d02-3e1c-4afb-921c-06c00262a694/manager/0.log" Apr 21 16:38:00.811263 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:00.811233 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-wtt24_d873a023-c5df-43cd-9008-fb0ac22d99a0/cluster-monitoring-operator/0.log" Apr 21 16:38:00.838027 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:00.837994 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jqg7t_3b732a56-615f-485f-8b03-9e7618e45b3a/kube-state-metrics/0.log" Apr 21 16:38:00.868939 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:00.868919 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jqg7t_3b732a56-615f-485f-8b03-9e7618e45b3a/kube-rbac-proxy-main/0.log" Apr 21 16:38:00.913258 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:00.913236 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jqg7t_3b732a56-615f-485f-8b03-9e7618e45b3a/kube-rbac-proxy-self/0.log" Apr 21 16:38:00.985339 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:00.985313 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-dqm5k_83af9817-94a0-4dbd-a6b9-f7220da08c3b/monitoring-plugin/0.log" Apr 21 16:38:01.184977 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.184955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzhmc_4f95fc4b-4e03-4d17-9691-dd89e4ec7182/node-exporter/0.log" Apr 21 16:38:01.208425 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.208406 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzhmc_4f95fc4b-4e03-4d17-9691-dd89e4ec7182/kube-rbac-proxy/0.log" Apr 21 16:38:01.230569 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.230552 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzhmc_4f95fc4b-4e03-4d17-9691-dd89e4ec7182/init-textfile/0.log" Apr 21 16:38:01.260191 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.260175 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vc9j4_696488ce-f740-4074-a939-4496126afadd/kube-rbac-proxy-main/0.log" Apr 21 16:38:01.282660 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.282642 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vc9j4_696488ce-f740-4074-a939-4496126afadd/kube-rbac-proxy-self/0.log" Apr 21 16:38:01.303381 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.303356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vc9j4_696488ce-f740-4074-a939-4496126afadd/openshift-state-metrics/0.log" Apr 21 16:38:01.507156 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.507106 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-b9rvv_9d58a911-4968-4350-90b2-66ae9a65496d/prometheus-operator/0.log" Apr 21 16:38:01.526072 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.526046 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-b9rvv_9d58a911-4968-4350-90b2-66ae9a65496d/kube-rbac-proxy/0.log" Apr 21 16:38:01.598308 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.598280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f44d4dcfb-t2r9p_f0d1e899-28ff-427b-87d3-5ad83798f78d/telemeter-client/0.log" Apr 21 16:38:01.626233 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.626203 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f44d4dcfb-t2r9p_f0d1e899-28ff-427b-87d3-5ad83798f78d/reload/0.log" Apr 21 16:38:01.655681 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.655663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f44d4dcfb-t2r9p_f0d1e899-28ff-427b-87d3-5ad83798f78d/kube-rbac-proxy/0.log" Apr 21 16:38:01.696993 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.696973 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c9b5b6fd8-fz645_99da288f-71c1-4f4b-8101-b0a009cb6add/thanos-query/0.log" Apr 21 16:38:01.718046 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.718030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c9b5b6fd8-fz645_99da288f-71c1-4f4b-8101-b0a009cb6add/kube-rbac-proxy-web/0.log" Apr 21 16:38:01.739676 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.739658 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c9b5b6fd8-fz645_99da288f-71c1-4f4b-8101-b0a009cb6add/kube-rbac-proxy/0.log" Apr 21 16:38:01.759870 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.759814 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c9b5b6fd8-fz645_99da288f-71c1-4f4b-8101-b0a009cb6add/prom-label-proxy/0.log" Apr 21 16:38:01.779716 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.779701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c9b5b6fd8-fz645_99da288f-71c1-4f4b-8101-b0a009cb6add/kube-rbac-proxy-rules/0.log" Apr 21 16:38:01.803549 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:01.803531 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-7c9b5b6fd8-fz645_99da288f-71c1-4f4b-8101-b0a009cb6add/kube-rbac-proxy-metrics/0.log" Apr 21 16:38:02.152107 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.152079 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn"] Apr 21 16:38:02.152457 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.152445 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerName="cleanup" Apr 21 16:38:02.152514 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.152458 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerName="cleanup" Apr 21 16:38:02.152514 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.152481 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerName="cleanup" Apr 21 16:38:02.152514 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.152486 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerName="cleanup" Apr 21 16:38:02.152616 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.152547 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerName="cleanup" Apr 21 16:38:02.152616 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.152555 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2641edfb-a299-46c3-bb5d-3899959b7cd7" containerName="cleanup" Apr 21 16:38:02.155692 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.155677 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.158595 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.158576 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zttst\"/\"openshift-service-ca.crt\"" Apr 21 16:38:02.159325 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.159241 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zttst\"/\"kube-root-ca.crt\"" Apr 21 16:38:02.159715 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.159700 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zttst\"/\"default-dockercfg-568sq\"" Apr 21 16:38:02.167521 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.167503 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn"] Apr 21 16:38:02.338323 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.338298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89gtp\" (UniqueName: \"kubernetes.io/projected/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-kube-api-access-89gtp\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.338428 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.338331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-proc\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: 
\"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.338428 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.338348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-lib-modules\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.338428 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.338382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-sys\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.338428 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.338420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-podres\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.439694 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89gtp\" (UniqueName: \"kubernetes.io/projected/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-kube-api-access-89gtp\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.439694 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-proc\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.439694 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-lib-modules\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.440029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-proc\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.440029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-lib-modules\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 
16:38:02.440029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-sys\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.440029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-sys\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.440029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-podres\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.440029 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.439919 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-podres\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.451472 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.451433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89gtp\" (UniqueName: \"kubernetes.io/projected/783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf-kube-api-access-89gtp\") pod \"perf-node-gather-daemonset-8fsmn\" (UID: \"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.467342 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.467321 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:02.603161 ip-10-0-138-191 kubenswrapper[2576]: W0421 16:38:02.603128 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod783d9f2a_fd9f_4ecc_be27_2b2c8a59d5bf.slice/crio-4a047af911877b45361efb8081bff87dcd0034c8a9d1a29400dddb3a63f47551 WatchSource:0}: Error finding container 4a047af911877b45361efb8081bff87dcd0034c8a9d1a29400dddb3a63f47551: Status 404 returned error can't find the container with id 4a047af911877b45361efb8081bff87dcd0034c8a9d1a29400dddb3a63f47551 Apr 21 16:38:02.603656 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.603636 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn"] Apr 21 16:38:02.604870 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.604841 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:38:02.891375 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:02.891348 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-g95tn_ca93eccd-f4c5-4bcf-a5f3-7521b23a86d4/networking-console-plugin/0.log" Apr 21 16:38:03.004982 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:03.004898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" event={"ID":"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf","Type":"ContainerStarted","Data":"db133a0bf6f9c0bb7b133b3e755434af29e2a3dcb7d7cf2744f71b539e215901"} Apr 21 16:38:03.004982 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:03.004938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" event={"ID":"783d9f2a-fd9f-4ecc-be27-2b2c8a59d5bf","Type":"ContainerStarted","Data":"4a047af911877b45361efb8081bff87dcd0034c8a9d1a29400dddb3a63f47551"} Apr 21 16:38:03.005173 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:03.005003 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:03.035346 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:03.035301 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" podStartSLOduration=1.03528832 podStartE2EDuration="1.03528832s" podCreationTimestamp="2026-04-21 16:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:38:03.033138764 +0000 UTC m=+2171.409320573" watchObservedRunningTime="2026-04-21 16:38:03.03528832 +0000 UTC m=+2171.411470129" Apr 21 16:38:03.990277 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:03.990247 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b95999c84-9vhn4_20353ddb-2c96-48a6-a3df-7d1daa3ad80f/console/0.log" Apr 21 16:38:05.307693 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:05.307664 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ffpt6_b713efcd-8ac3-49bc-a307-d7919be3e7ba/dns/0.log" Apr 21 16:38:05.328406 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:05.328381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ffpt6_b713efcd-8ac3-49bc-a307-d7919be3e7ba/kube-rbac-proxy/0.log" Apr 21 
16:38:05.479089 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:05.479064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qdd9v_3debd313-0085-4668-87ee-27bfcceec0f2/dns-node-resolver/0.log" Apr 21 16:38:06.006958 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:06.006932 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r68ws_0983fe40-299f-4402-9953-2982f995ea8f/node-ca/0.log" Apr 21 16:38:06.845922 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:06.845891 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfbv6ft_50c064db-1acb-40aa-87af-19f89d6d568c/istio-proxy/0.log" Apr 21 16:38:07.117313 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:07.117237 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-j9mdp_3a8f8c79-05b9-4e1f-a208-dca73b68e53b/istio-proxy/0.log" Apr 21 16:38:07.138631 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:07.138608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5d9c9b646-wbgwm_60362f96-0ab5-45f5-b480-008e39e6ac85/router/0.log" Apr 21 16:38:07.688098 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:07.688069 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rsggj_e01c7fac-d92f-4341-ad18-e2c502d62462/serve-healthcheck-canary/0.log" Apr 21 16:38:08.199098 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:08.199073 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rtv56_7da6b8da-7bb2-4107-8411-5fe4591aa3d3/kube-rbac-proxy/0.log" Apr 21 16:38:08.219478 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:08.219456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rtv56_7da6b8da-7bb2-4107-8411-5fe4591aa3d3/exporter/0.log" Apr 21 16:38:08.238931 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:08.238911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rtv56_7da6b8da-7bb2-4107-8411-5fe4591aa3d3/extractor/0.log" Apr 21 16:38:09.021815 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:09.021791 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-8fsmn" Apr 21 16:38:10.348629 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:10.348595 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-548ffc956-qbx9t_effcc86e-c5d4-4c78-8b70-066f0aefb563/manager/0.log" Apr 21 16:38:10.532660 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:10.532632 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-774f54dc87-vqs87_8b028677-6d59-4d75-9eb8-5758225d6d88/manager/0.log" Apr 21 16:38:10.572717 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:10.572696 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-xjfcw_fda97a47-c138-4a75-b145-85d16968d1eb/postgres/0.log" Apr 21 16:38:12.004042 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:12.003959 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-d98d6987c-5clzh_597d1320-fab9-47a1-8230-333f5695b010/manager/0.log" Apr 21 16:38:18.267123 
ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.267094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rxw6j_c5c002ed-ea31-4b63-a340-208a5f1f28e3/kube-multus-additional-cni-plugins/0.log" Apr 21 16:38:18.297194 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.297175 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rxw6j_c5c002ed-ea31-4b63-a340-208a5f1f28e3/egress-router-binary-copy/0.log" Apr 21 16:38:18.318140 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.318116 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rxw6j_c5c002ed-ea31-4b63-a340-208a5f1f28e3/cni-plugins/0.log" Apr 21 16:38:18.337530 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.337509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rxw6j_c5c002ed-ea31-4b63-a340-208a5f1f28e3/bond-cni-plugin/0.log" Apr 21 16:38:18.359146 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.359130 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rxw6j_c5c002ed-ea31-4b63-a340-208a5f1f28e3/routeoverride-cni/0.log" Apr 21 16:38:18.380290 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.380268 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rxw6j_c5c002ed-ea31-4b63-a340-208a5f1f28e3/whereabouts-cni-bincopy/0.log" Apr 21 16:38:18.404284 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.404262 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rxw6j_c5c002ed-ea31-4b63-a340-208a5f1f28e3/whereabouts-cni/0.log" Apr 21 16:38:18.483241 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.483224 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hcrxk_4ecc38cb-25f5-4bc0-ae83-f6897b537b0a/kube-multus/0.log" Apr 21 16:38:18.520429 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.520378 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-clkll_b782d8e0-5d5e-4897-91b4-49f7049c7fac/network-metrics-daemon/0.log" Apr 21 16:38:18.544897 ip-10-0-138-191 kubenswrapper[2576]: I0421 16:38:18.544877 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-clkll_b782d8e0-5d5e-4897-91b4-49f7049c7fac/kube-rbac-proxy/0.log"