./mage -v ci:teste2e
go: downloading golang.org/x/sys v0.32.0
Running target: CI:TestE2E
I1024 12:56:04.983452 20388 magefile.go:529] setting up new custom bundle for testing...
I1024 12:56:05.349489 20388 util.go:521] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761310565-qdqs -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: docker-build to image
I1024 12:56:07.151201 20388 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761310565-qdqs: quay.io/redhat-appstudio-qe/test-images@sha256:3089aa11a52cf7c322d09d7d07e681509f708bc1b0bfdc9b77732bae2a2ee182
I1024 12:56:07.151222 20388 magefile.go:535] To use the custom docker bundle locally, run below cmd:
export CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE=quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761310565-qdqs
I1024 12:56:07.151247 20388 integration_service.go:49] checking if repository is integration-service
I1024 12:56:07.151252 20388 image_controller.go:49] checking if repository is image-controller
I1024 12:56:07.151258 20388 build_service.go:49] checking if repository is build-service
I1024 12:56:07.151265 20388 e2e_repo.go:347] checking if repository is e2e-tests
I1024 12:56:07.151269 20388 e2e_repo.go:335] multi-platform tests and require sprayproxy registering are set to TRUE
exec: git "diff" "--name-status" "upstream/main..HEAD"
I1024 12:56:07.154032 20388 util.go:460] The following files, go.mod, go.sum, were changed!
exec: go "install" "-mod=mod" "github.com/onsi/ginkgo/v2/ginkgo"
go: downloading github.com/go-task/slim-sprig/v3 v3.0.0
go: downloading github.com/google/pprof v0.0.0-20251007162407-5df77e3f7d1d
I1024 12:56:10.269060 20388 install.go:188] cloning 'https://github.com/redhat-appstudio/infra-deployments' with git ref 'refs/heads/main'
Enumerating objects: 66640, done.
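The magefile prints the `export` command above so the freshly built bundle can be reused in a local run. A small shell sketch of consuming that variable; the repository/tag split via parameter expansion is illustrative and not part of the magefile:

```shell
# Reuse the custom bundle ref printed by the magefile (value copied from the log).
export CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE=quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761310565-qdqs

# Split the image ref into repository and tag, e.g. to inspect it with other tools.
repo="${CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE%%:*}"  # text before the first ':'
tag="${CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE##*:}"   # text after the last ':'
echo "repo=$repo"
echo "tag=$tag"
```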
Counting objects: 100% (237/237), done.
Compressing objects: 100% (112/112), done.
Total 66640 (delta 170), reused 135 (delta 125), pack-reused 66403 (from 5)
From https://github.com/redhat-appstudio/infra-deployments
 * branch            main       -> FETCH_HEAD
Already up to date.
Installing the OpenShift GitOps operator subscription:
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
subscription.operators.coreos.com/openshift-gitops-operator created
Waiting for default project (and namespace) to exist: OK
Waiting for OpenShift GitOps Route: OK
argocd.argoproj.io/openshift-gitops patched
argocd.argoproj.io/openshift-gitops patched
Switch the Route to use re-encryption
argocd.argoproj.io/openshift-gitops patched
Restarting ArgoCD Server
pod "openshift-gitops-server-78868c5878-2wsjz" deleted
Allow any authenticated users to be admin on the Argo CD instance
argocd.argoproj.io/openshift-gitops patched
Mark Pending PVC as Healthy, workaround for WaitForFirstConsumer StorageClasses.
Warning: unknown field "spec.resourceCustomizations"
argocd.argoproj.io/openshift-gitops patched (no change)
Setting kustomize build options
argocd.argoproj.io/openshift-gitops patched
Setting ignore Aggregated Roles
argocd.argoproj.io/openshift-gitops patched
Setting ArgoCD tracking method to annotation
argocd.argoproj.io/openshift-gitops patched
Restarting GitOps server
deployment.apps/openshift-gitops-server restarted
=========================================================================
Argo CD URL is: https://openshift-gitops-server-openshift-gitops.apps.rosa.kx-f5e8001f00.m40g.p3.openshiftapps.com
(NOTE: It may take a few moments for the route to become available)
Waiting for the route: OK
Login/password uses your OpenShift credentials ('Login with OpenShift' button)
Setting secrets for Quality Dashboard
namespace/quality-dashboard created
secret/quality-dashboard-secrets created
Creating secret for CI Helper App
namespace/ci-helper-app created
secret/ci-helper-app-secrets created
Setting secrets for pipeline-service
tekton-results namespace already exists, skipping creation
tekton-logging namespace already exists, skipping creation
namespace/product-kubearchive-logging created
Creating DB secret
secret/tekton-results-database created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
secret/minio-storage-configuration created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
MinIO config already exists, skipping creation
Creating Postgres TLS certs
[openssl key-generation progress elided]
Certificate request self-signature ok
subject=CN=cluster.local
[openssl key-generation progress elided]
Certificate request self-signature ok
subject=CN=postgres-postgresql.tekton-results.svc.cluster.local
secret/postgresql-tls created
configmap/rds-root-crt created
namespace/application-service created
Creating a has secret from legacy token
secret/has-github-token created
Creating a secret with a token for Image Controller
namespace/image-controller created
secret/quaytoken created
Configuring the cluster with a pull secret for Docker Hub
Saved credentials for docker.io into /tmp/tmp.kni41pMEzd
secret/pull-secret data updated
Saved credentials for docker.io into /tmp/tmp.kni41pMEzd
secret/docker-io-pull created
serviceaccount/appstudio-pipeline created
Setting secrets for Dora metrics exporter
namespace/dora-metrics created
secret/exporters-secret created
Setting Cluster Mode: preview
Switched to a new branch 'preview-main-ubmg'
labeling node/ip-10-0-136-118.ec2.internal...
node/ip-10-0-136-118.ec2.internal labeled
successfully labeled node/ip-10-0-136-118.ec2.internal
labeling node/ip-10-0-156-149.ec2.internal...
node/ip-10-0-156-149.ec2.internal labeled
successfully labeled node/ip-10-0-156-149.ec2.internal
labeling node/ip-10-0-160-157.ec2.internal...
node/ip-10-0-160-157.ec2.internal labeled
successfully labeled node/ip-10-0-160-157.ec2.internal
verifying labels... all nodes labeled successfully.
Detected OCP minor version: 17
Changing AppStudio Gitlab Org to "redhat-appstudio-qe"
[preview-main-ubmg b29e78089] Preview mode, do not merge into main
 6 files changed, 12 insertions(+), 18 deletions(-)
remote:
remote: Create a pull request for 'preview-main-ubmg' on GitHub by visiting:
remote:      https://github.com/redhat-appstudio-qe/infra-deployments/pull/new/preview-main-ubmg
remote:
To https://github.com/redhat-appstudio-qe/infra-deployments.git
 * [new branch]      preview-main-ubmg -> preview-main-ubmg
branch 'preview-main-ubmg' set up to track 'qe/preview-main-ubmg'.
application.argoproj.io/all-application-sets created
Waiting for sync of all-application-sets argoCD app
[previous line repeated 6 times in total]
application.argoproj.io/disable-csvcopy-in-cluster-local patched
application.argoproj.io/cert-manager-in-cluster-local patched
application.argoproj.io/build-templates-in-cluster-local patched
application.argoproj.io/dora-metrics-in-cluster-local patched
application.argoproj.io/konflux-kite-in-cluster-local patched
application.argoproj.io/all-application-sets patched
application.argoproj.io/integration-in-cluster-local patched
application.argoproj.io/internal-services-in-cluster-local patched
application.argoproj.io/tracing-workload-otel-collector-in-cluster-local patched
application.argoproj.io/image-controller-in-cluster-local patched
application.argoproj.io/enterprise-contract-in-cluster-local patched
application.argoproj.io/konflux-rbac-in-cluster-local patched
application.argoproj.io/tempo-in-cluster-local patched
application.argoproj.io/has-in-cluster-local patched
application.argoproj.io/application-api-in-cluster-local patched
application.argoproj.io/kubearchive-in-cluster-local patched
application.argoproj.io/squid-in-cluster-local patched
application.argoproj.io/build-service-in-cluster-local patched
application.argoproj.io/vector-tekton-logs-collector-in-cluster-local patched
application.argoproj.io/mintmaker-in-cluster-local patched
application.argoproj.io/pipeline-service-in-cluster-local patched
application.argoproj.io/release-in-cluster-local patched
application.argoproj.io/perf-team-prometheus-reader-in-cluster-local patched
application.argoproj.io/monitoring-workload-grafana-in-cluster-local patched
application.argoproj.io/tracing-workload-tracing-in-cluster-local patched
application.argoproj.io/knative-eventing-in-cluster-local patched
application.argoproj.io/trust-manager-in-cluster-local patched
application.argoproj.io/repository-validator-in-cluster-local patched
application.argoproj.io/multi-platform-controller-in-cluster-local patched
application.argoproj.io/vector-kubearchive-log-collector-in-cluster-local patched (no change)
application.argoproj.io/crossplane-control-plane-in-cluster-local patched
application.argoproj.io/kueue-in-cluster-local patched
application.argoproj.io/kyverno-in-cluster-local patched
application.argoproj.io/policies-in-cluster-local patched
application.argoproj.io/monitoring-workload-prometheus-in-cluster-local patched
application.argoproj.io/project-controller-in-cluster-local patched
pipeline-service-in-cluster-local OutOfSync Missing
policies-in-cluster-local OutOfSync Healthy
Waiting 10 seconds for application sync
pipeline-service-in-cluster-local OutOfSync Missing
policies-in-cluster-local OutOfSync Healthy
Waiting 10 seconds for application sync
policies-in-cluster-local OutOfSync Healthy
Waiting 10 seconds for application sync
["policies-in-cluster-local OutOfSync Healthy / Waiting 10 seconds for application sync" repeated until synced]
All Applications are synced and Healthy
All required tekton resources are installed and ready
Tekton CRDs are ready
Setup Pac with existing QE sprayproxy and github App
namespace/openshift-pipelines configured
namespace/build-service configured
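The wait loop above polls Argo CD Application sync state every 10 seconds. A rough hand-run equivalent, assuming `kubectl` access and that the Application CRs live in the `openshift-gitops` namespace (the namespace is an assumption; the log does not show it):

```shell
# Illustrative sketch: loop until every Argo CD Application reports "Synced".
while kubectl get applications.argoproj.io -n openshift-gitops \
    -o jsonpath='{range .items[*]}{.status.sync.status}{"\n"}{end}' | grep -qv Synced; do
  echo "Waiting 10 seconds for application sync"
  sleep 10
done
```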
namespace/integration-service configured secret/pipelines-as-code-secret created secret/pipelines-as-code-secret created secret/pipelines-as-code-secret created secret/pipelines-as-code-secret created Configured pipelines-as-code-secret secret in openshift-pipelines namespace Switched to branch 'main' Your branch is up to date with 'upstream/main'. [controller-runtime] log.SetLogger(...) was never called; logs will not be displayed. Detected at: > goroutine 104 [running]: > runtime/debug.Stack() > /usr/lib/golang/src/runtime/debug/stack.go:26 +0x5e > sigs.k8s.io/controller-runtime/pkg/log.eventuallyFulfillRoot() > /opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/log.go:60 +0xcd > sigs.k8s.io/controller-runtime/pkg/log.(*delegatingLogSink).WithName(0xc0005c7a40, {0x2f940a0, 0x14}) > /opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/deleg.go:147 +0x3e > github.com/go-logr/logr.Logger.WithName({{0x36edb90, 0xc0005c7a40}, 0x0}, {0x2f940a0?, 0x0?}) > /opt/app-root/src/go/pkg/mod/github.com/go-logr/logr@v1.4.2/logr.go:345 +0x36 > sigs.k8s.io/controller-runtime/pkg/client.newClient(0x2d71b00?, {0x0, 0xc0001685b0, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0}) > /opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:129 +0xf1 > sigs.k8s.io/controller-runtime/pkg/client.New(0xc00092f448?, {0x0, 0xc0001685b0, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0}) > /opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:110 +0x7d > github.com/konflux-ci/e2e-tests/pkg/clients/kubernetes.NewAdminKubernetesClient() > /tmp/tmp.MgeSCHnNQK/pkg/clients/kubernetes/client.go:157 +0xa5 > github.com/konflux-ci/e2e-tests/pkg/clients/sprayproxy.GetPaCHost() > /tmp/tmp.MgeSCHnNQK/pkg/clients/sprayproxy/sprayproxy.go:93 +0x1c > github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.registerPacServer() > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/repos/common.go:426 +0x78 > 
github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.init.func8(0xc000244008?) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/repos/common.go:378 +0x25 > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.ActionFunc.Execute(0xc?, 0x2f6eb15?) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:279 +0x19 > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Apply(...) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:315 > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x5246200, 0xc000244008) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:348 +0xb3 > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x523ec80?, 0xc000e0bc00?, 0x1f1bc99?}, 0xc000244008) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:245 +0x4f > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:308 > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x52462c0, 0xc000244008) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:340 +0x2b > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x5247f40?, 0x4295dc?, 0x52c9b40?}, 0xc000244008) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:245 +0x4f > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...) 
> /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:308 > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).runLoadedCatalog(0x527d970, {0xc001556008?, 0xc00120be60?, 0x47?}, 0xc000244008) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:129 +0x119 > github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).RunRulesOfCategory(0x527d970, {0x2f68ec3, 0x2}, 0xc000244008) > /tmp/tmp.MgeSCHnNQK/magefiles/rulesengine/types.go:121 +0x1b4 > main.CI.TestE2E({}) > /tmp/tmp.MgeSCHnNQK/magefiles/magefile.go:330 +0x18a > main.main.func19({0x0?, 0x0?}) > /tmp/tmp.MgeSCHnNQK/magefiles/mage_output_file.go:827 +0xf > main.main.func12.1() > /tmp/tmp.MgeSCHnNQK/magefiles/mage_output_file.go:302 +0x5b > created by main.main.func12 in goroutine 1 > /tmp/tmp.MgeSCHnNQK/magefiles/mage_output_file.go:297 +0xbe I1024 13:17:13.389424 20388 common.go:434] Registered PaC server: https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-f5e8001f00.m40g.p3.openshiftapps.com I1024 13:17:13.450728 20388 common.go:459] The PaC servers registered in Sprayproxy: https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-549fb3de3d.4kk7.p3.openshiftapps.com, https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-f5e8001f00.m40g.p3.openshiftapps.com, https://pipelines-as-code-controller-openshift-pipelines.apps.konflux-4-17-us-west-2-7j88w.konflux-qe.devcluster.openshift.com I1024 13:17:13.450751 20388 common.go:475] going to create new Tekton bundle remote-build for the purpose of testing multi-platform-controller PR I1024 13:17:13.863189 20388 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:e8648dff9859d89a0dbaedd120eec2e07afbffc850e706d16d0ac479a905efa0 I1024 13:17:13.865574 20388 util.go:521] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311833-bwnq -> user: redhat-appstudio-qe+redhat_appstudio_quality Creating Tekton Bundle: - Added 
Pipeline: buildah-remote-pipeline to image I1024 13:17:15.180529 20388 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311833-bwnq: quay.io/redhat-appstudio-qe/test-images@sha256:c537588ef31a5140707731f980a976d26e380314c9e1eea0e2395a488dd3c2df I1024 13:17:15.180550 20388 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_ARM64 to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311833-bwnq I1024 13:17:15.421678 20388 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:e8648dff9859d89a0dbaedd120eec2e07afbffc850e706d16d0ac479a905efa0 I1024 13:17:15.424246 20388 util.go:521] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311835-dhtl -> user: redhat-appstudio-qe+redhat_appstudio_quality Creating Tekton Bundle: - Added Pipeline: buildah-remote-pipeline to image I1024 13:17:16.779821 20388 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311835-dhtl: quay.io/redhat-appstudio-qe/test-images@sha256:c16f92c6952d4bbf1ac7c4ee6f037f67201cc4812f92f28d33bcf63b7b6084c7 I1024 13:17:16.779842 20388 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_S390X to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311835-dhtl I1024 13:17:17.138599 20388 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:e8648dff9859d89a0dbaedd120eec2e07afbffc850e706d16d0ac479a905efa0 I1024 13:17:17.140942 20388 util.go:521] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311836-bawl -> user: redhat-appstudio-qe+redhat_appstudio_quality Creating Tekton Bundle: - Added Pipeline: buildah-remote-pipeline to image I1024 13:17:18.702099 20388 bundle.go:57] image digest for a new tekton bundle 
quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311836-bawl: quay.io/redhat-appstudio-qe/test-images@sha256:757e279f4854186136bfafb29d4612e71a2c7f484bbef830a3ba6d66a4c32aff
I1024 13:17:18.702135 20388 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_PPC64LE to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311836-bawl
exec: ginkgo "--seed=1761310564" "--timeout=1h30m0s" "--grace-period=30s" "--output-interceptor-mode=none" "--no-color" "--json-report=e2e-report.json" "--junit-report=e2e-report.xml" "--procs=20" "--nodes=20" "--p" "--output-dir=/workspace/artifact-dir" "./cmd" "--"
go: downloading github.com/aws/aws-sdk-go-v2/config v1.28.7
go: downloading github.com/aws/aws-sdk-go-v2/service/ec2 v1.135.0
go: downloading github.com/IBM/go-sdk-core/v5 v5.15.3
go: downloading github.com/IBM/vpc-go-sdk v0.48.0
go: downloading github.com/aws/aws-sdk-go-v2 v1.32.7
go: downloading github.com/konflux-ci/build-service v0.0.0-20240611083846-2dee6cfe6fe4
go: downloading github.com/aws/smithy-go v1.22.1
go: downloading github.com/aws/aws-sdk-go-v2/credentials v1.17.48
go: downloading github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.16.22
go: downloading github.com/aws/aws-sdk-go-v2/internal/ini v1.8.1
go: downloading github.com/aws/aws-sdk-go-v2/service/sso v1.24.8
go: downloading github.com/aws/aws-sdk-go-v2/service/ssooidc v1.28.7
go: downloading github.com/aws/aws-sdk-go-v2/service/sts v1.33.3
go: downloading github.com/go-openapi/strfmt v0.22.0
go: downloading github.com/go-playground/validator/v10 v10.17.0
go: downloading github.com/aws/aws-sdk-go-v2/internal/configsources v1.3.26
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.12.7
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/mitchellh/mapstructure v1.5.0
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.12.1
go: downloading
github.com/oklog/ulid v1.3.1
go: downloading github.com/go-openapi/errors v0.21.0
go: downloading go.mongodb.org/mongo-driver v1.13.1
go: downloading github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.26
go: downloading github.com/google/go-github/v45 v45.2.0
go: downloading github.com/leodido/go-urn v1.3.0
go: downloading github.com/gabriel-vasile/mimetype v1.4.3
go: downloading github.com/go-playground/universal-translator v0.18.1
go: downloading github.com/go-playground/locales v0.14.1
Running Suite: Red Hat App Studio E2E tests - /tmp/tmp.MgeSCHnNQK/cmd
=====================================================================
Random Seed: 1761310564

Will run 355 of 389 specs
Running in parallel across 20 processes
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if release CR is created [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.MgeSCHnNQK/tests/release/pipelines/release_to_github.go:139
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies the release pipelinerun is running and succeeds [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.MgeSCHnNQK/tests/release/pipelines/release_to_github.go:149
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies release CR completed and set succeeded.
[release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/release_to_github.go:182 ------------------------------ P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if the Release exists in github repo [release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/release_to_github.go:193 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-host-pool] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:120 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-host-pool] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:124 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-host-pool] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:127 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-host-pool] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:148 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created test that cleanup happened 
successfully [multi-platform, aws-host-pool] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:152 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:251 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:255 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:259 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:263 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, aws-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:267 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmz-dynamic] 
/tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:341 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmz-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:345 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmz-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:349 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmz-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:353 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmz-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:357 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmp-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:432 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote 
[multi-platform, ibmp-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:436 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmp-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:440 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmp-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:444 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmp-dynamic] /tmp/tmp.MgeSCHnNQK/tests/build/multi-platform.go:448 ------------------------------ • [FAILED] [0.697 seconds] [release-pipelines-suite FBC e2e-tests] with FBC happy path [BeforeAll] Post-release verification creates component from git source https://github.com/redhat-appstudio-qe/fbc-sample-repo-test [release-pipelines, fbc-release, fbcHappyPath] [BeforeAll] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/fbc_release.go:89 [It] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/fbc_release.go:123 Timeline >> [FAILED] in [BeforeAll] - /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323 @ 10/24/25 13:18:51.119 [PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 10/24/25 13:18:51.119 << Timeline [FAILED] Unexpected error: <*url.Error | 0xc000b463c0>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/managed-release-team/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup 
api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host { Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/managed-release-team/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc001331590>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc001331400>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, } occurred In [BeforeAll] at: /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323 @ 10/24/25 13:18:51.119 There were additional failures detected. To view them in detail run ginkgo -vv ------------------------------ SSSSSSSSSSSSSSS ------------------------------ • [FAILED] [0.725 seconds] [release-pipelines-suite e2e tests for multi arch with rh-advisories pipeline] Multi arch test happy path [BeforeAll] Post-release verification verifies the release CR is created [release-pipelines, rh-advisories, multiarch-advisories, multiArchAdvisories] [BeforeAll] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/multiarch_advisories.go:61 [It] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/multiarch_advisories.go:113 Timeline >> [FAILED] in [BeforeAll] - /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323 @ 10/24/25 13:18:51.154 [PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 10/24/25 13:18:51.154 << Timeline [FAILED] Unexpected error: <*url.Error | 0xc000ce4ed0>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/managed-release-team/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host { Op: "Get", URL: 
"https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/managed-release-team/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc0004db7c0>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc0004db540>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, } occurred In [BeforeAll] at: /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323 @ 10/24/25 13:18:51.154 There were additional failures detected. To view them in detail run ginkgo -vv ------------------------------ SSS ------------------------------ • [FAILED] [0.556 seconds] [release-pipelines-suite e2e tests for rh-advisories pipeline] Rh-advisories happy path [BeforeAll] Post-release verification verifies if release CR is created [release-pipelines, rh-advisories, rhAdvisories] [BeforeAll] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rh_advisories.go:61 [It] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rh_advisories.go:118 Timeline >> [FAILED] in [BeforeAll] - /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323 @ 10/24/25 13:18:51.287 [PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 10/24/25 13:18:51.287 << Timeline [FAILED] Unexpected error: <*url.Error | 0xc00129ee10>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/managed-release-team/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host { Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/managed-release-team/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc000c18280>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, 
Err: <*net.DNSError | 0xc000c180f0>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, } occurred In [BeforeAll] at: /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323 @ 10/24/25 13:18:51.287 There were additional failures detected. To view them in detail run ginkgo -vv ------------------------------ SSS••• ------------------------------ • [FAILED] [4.585 seconds] [release-pipelines-suite e2e tests for rhtap-service-push pipeline] Rhtap-service-push happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rhtap-service-push, RhtapServicePush] [BeforeAll] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rhtap_service_push.go:75 [It] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rhtap_service_push.go:150 Timeline >> PR #3558 got created with sha 182476b24972459e43e22abb038e4aaaa794ccf0 merged result sha: c76a464c23163bb241f7dd9386fd9300f8825aff for PR #3558 [FAILED] in [BeforeAll] - /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rhtap_service_push.go:119 @ 10/24/25 13:18:54.909 [PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 10/24/25 13:18:54.909 << Timeline [FAILED] Unexpected error: <*fmt.wrapError | 0xc000b0ef40>: failed to get API group resources: unable to retrieve the complete list of server APIs: appstudio.redhat.com/v1alpha1: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/dev-release-team/apis/appstudio.redhat.com/v1alpha1": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host { msg: "failed to get API group resources: unable to retrieve the complete list of server APIs: appstudio.redhat.com/v1alpha1: Get 
\"https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/dev-release-team/apis/appstudio.redhat.com/v1alpha1\": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host", err: <*apiutil.ErrResourceDiscoveryFailed | 0xc0000c7508>{ { Group: "appstudio.redhat.com", Version: "v1alpha1", }: <*url.Error | 0xc000e4c540>{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/dev-release-team/apis/appstudio.redhat.com/v1alpha1", Err: <*net.OpError | 0xc000b5bc70>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc000b5bae0>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }, }, } occurred In [BeforeAll] at: /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rhtap_service_push.go:119 @ 10/24/25 13:18:54.909 There were additional failures detected. 
To view them in detail run ginkgo -vv ------------------------------ SSS• ------------------------------ • [PANICKED] [12.926 seconds] [upgrade-suite Create users and check their state] [It] Verify AppStudioProvisionedUser [upgrade-verify] /tmp/tmp.MgeSCHnNQK/tests/upgrade/verifyWorkload.go:20 Timeline >> "msg"="Observed a panic: \"invalid memory address or nil pointer dereference\" (runtime error: invalid memory address or nil pointer dereference)\ngoroutine 101 [running]:\nk8s.io/apimachinery/pkg/util/runtime.logPanic({0x2c40d20, 0x53f0340})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:75 +0x85\nk8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc00087afc0?})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:49 +0x65\npanic({0x2c40d20?, 0x53f0340?})\n\t/usr/lib/golang/src/runtime/panic.go:792 +0x132\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp.func1()\n\t/tmp/tmp.MgeSCHnNQK/pkg/sandbox/sandbox.go:329 +0x35\ngithub.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval.func1({0xee6b2800?, 0x0?})\n\t/tmp/tmp.MgeSCHnNQK/pkg/utils/util.go:138 +0x13\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1(0xc000d93600?, {0x383aec8?, 0xc0000f0bd0?})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:53 +0x52\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x383aec8, 0xc0000f0bd0}, {0x382f990, 0xc000d93600}, 0x1, 0x0, 0xc00090de68)\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:54 +0x115\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x383ad78?, 0x54a64c0?}, 0xee6b2800, 0x419be5?, 0x1, 0xc00090de68)\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/poll.go:48 +0xa5\ngithub.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval(0xa?, 0xc0004bfeb0?, 0x1?)\n\t/tmp/tmp.MgeSCHnNQK/pkg/utils/util.go:138 
+0x45\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp(0x322865b?, {0x322865b?, 0x32252a5?}, 0x8?)\n\t/tmp/tmp.MgeSCHnNQK/pkg/sandbox/sandbox.go:328 +0x72\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreated(0x0, {0x322865b, 0x9})\n\t/tmp/tmp.MgeSCHnNQK/pkg/sandbox/sandbox.go:324 +0x4b\ngithub.com/konflux-ci/e2e-tests/tests/upgrade/verify.VerifyAppStudioProvisionedUser(0x0?)\n\t/tmp/tmp.MgeSCHnNQK/tests/upgrade/verify/verifyUsers.go:14 +0x25\ngithub.com/konflux-ci/e2e-tests/tests/upgrade.init.func1.2()\n\t/tmp/tmp.MgeSCHnNQK/tests/upgrade/verifyWorkload.go:21 +0x1a\ngithub.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x7c2976?, 0xc0015a8300?})\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/node.go:475 +0x13\ngithub.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func3()\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/suite.go:894 +0x7b\ncreated by github.com/onsi/ginkgo/v2/internal.(*Suite).runNode in goroutine 82\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/suite.go:881 +0xd7b" "error"=null
[PANICKED] in [It] - /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 @ 10/24/25 13:19:03.649
<< Timeline

[PANICKED] Test Panicked
In [It] at: /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 @ 10/24/25 13:19:03.649

runtime error: invalid memory address or nil pointer dereference

Full Stack Trace
  k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc00087afc0?})
  	/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 +0xc7
  panic({0x2c40d20?, 0x53f0340?})
  	/usr/lib/golang/src/runtime/panic.go:792 +0x132
  github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp.func1()
  	/tmp/tmp.MgeSCHnNQK/pkg/sandbox/sandbox.go:329 +0x35
  github.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval.func1({0xee6b2800?, 0x0?})
  	/tmp/tmp.MgeSCHnNQK/pkg/utils/util.go:138 +0x13
  k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1(0xc000d93600?, {0x383aec8?, 0xc0000f0bd0?})
  	/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:53 +0x52
  k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x383aec8, 0xc0000f0bd0}, {0x382f990, 0xc000d93600}, 0x1, 0x0, 0xc0011e9e68)
  	/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:54 +0x115
  k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x383ad78?, 0x54a64c0?}, 0xee6b2800, 0x419be5?, 0x1, 0xc00090de68)
  	/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/poll.go:48 +0xa5
  github.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval(0xa?, 0xc0004bfeb0?, 0x1?)
  	/tmp/tmp.MgeSCHnNQK/pkg/utils/util.go:138 +0x45
  github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp(0x322865b?, {0x322865b?, 0x32252a5?}, 0x8?)
  	/tmp/tmp.MgeSCHnNQK/pkg/sandbox/sandbox.go:328 +0x72
  github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreated(0x0, {0x322865b, 0x9})
  	/tmp/tmp.MgeSCHnNQK/pkg/sandbox/sandbox.go:324 +0x4b
  github.com/konflux-ci/e2e-tests/tests/upgrade/verify.VerifyAppStudioProvisionedUser(0x0?)
  	/tmp/tmp.MgeSCHnNQK/tests/upgrade/verify/verifyUsers.go:14 +0x25
  github.com/konflux-ci/e2e-tests/tests/upgrade.init.func1.2()
  	/tmp/tmp.MgeSCHnNQK/tests/upgrade/verifyWorkload.go:21 +0x1a
------------------------------
SS
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params when context points to a file [build-templates]
/tmp/tmp.MgeSCHnNQK/tests/build/tkn-bundle.go:177
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles from specific context [build-templates]
/tmp/tmp.MgeSCHnNQK/tests/build/tkn-bundle.go:188
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params when context is the root directory [build-templates]
/tmp/tmp.MgeSCHnNQK/tests/build/tkn-bundle.go:198
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when context points to a file and a directory [build-templates]
/tmp/tmp.MgeSCHnNQK/tests/build/tkn-bundle.go:207
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when using negation [build-templates]
/tmp/tmp.MgeSCHnNQK/tests/build/tkn-bundle.go:217
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params allows overriding HOME environment variable [build-templates]
/tmp/tmp.MgeSCHnNQK/tests/build/tkn-bundle.go:227
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params allows overriding STEP image [build-templates]
/tmp/tmp.MgeSCHnNQK/tests/build/tkn-bundle.go:236
------------------------------
• [FAILED] [0.754 seconds]
[release-pipelines-suite e2e tests for rh-push-to-redhat-io pipeline] Rh-push-to-redhat-io happy path [BeforeAll] Post-release
verification verifies if the release CR is created [release-pipelines, rh-push-to-registry-redhat-io, PushToRedhatIO] [BeforeAll] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rh_push_to_registry_redhat_io.go:61 [It] /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rh_push_to_registry_redhat_io.go:110 Timeline >> [FAILED] in [BeforeAll] - /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323 @ 10/24/25 13:19:04.02 [PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 10/24/25 13:19:04.404 << Timeline [FAILED] Unexpected error: <*url.Error | 0xc0016da5a0>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/managed-release-team/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host { Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/workspaces/managed-release-team/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc000e7c370>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc000cefea0>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, } occurred In [BeforeAll] at: /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323 @ 10/24/25 13:19:04.02 There were additional failures detected. 
To view them in detail run ginkgo -vv
------------------------------
SSS••••••••••••••••
------------------------------
• [FAILED] [71.792 seconds]
[build-service-suite Build templates E2E test] HACBS pipelines [BeforeAll] triggers PipelineRun for symlink component with source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic with component name test-symlink-comp-czct [build, build-templates, HACBS, pipeline-service, pipeline, build-templates-e2e, source-build-e2e]
[BeforeAll] /tmp/tmp.MgeSCHnNQK/tests/build/build_templates.go:235
[It] /tmp/tmp.MgeSCHnNQK/tests/build/build_templates.go:319
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.MgeSCHnNQK/tests/build/build_templates.go:261 @ 10/24/25 13:19:02.535
error while getting pipelineruns: no pipelinerun found for application test-app-nile
error while getting pipelineruns: no pipelinerun found for application test-app-nile
error while getting pipelineruns: no pipelinerun found for application test-app-nile
error while getting pipelineruns: no pipelinerun found for application test-app-nile
error while getting pipelineruns: no pipelinerun found for application test-app-nile
error while getting pipelineruns: no pipelinerun found for application test-app-nile
[FAILED] in [AfterAll] - /tmp/tmp.MgeSCHnNQK/tests/build/build_templates.go:287 @ 10/24/25 13:20:02.536
<< Timeline

[FAILED] failed to create component for scenario: sample-python-basic-oci
Unexpected error:
    <*errors.errorString | 0xc000e76a10>:
    failed to update BUILDAH_FORMAT in the pipeline bundle with: error when building/pushing a tekton pipeline bundle: error when pushing a bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311942-esyi to a container image registry repo: could not push image to registry as "quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311942-esyi": POST https://quay.io/v2/redhat-appstudio-qe/test-images/blobs/uploads/: UNAUTHORIZED: access to the requested resource is not authorized; map[]
    {
s: "failed to update BUILDAH_FORMAT in the pipeline bundle with: error when building/pushing a tekton pipeline bundle: error when pushing a bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311942-esyi to a container image registry repo: could not push image to registry as \"quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761311942-esyi\": POST https://quay.io/v2/redhat-appstudio-qe/test-images/blobs/uploads/: UNAUTHORIZED: access to the requested resource is not authorized; map[]\n", } occurred In [BeforeAll] at: /tmp/tmp.MgeSCHnNQK/tests/build/build_templates.go:261 @ 10/24/25 13:19:02.535 There were additional failures detected. To view them in detail run ginkgo -vv ------------------------------ SSSSSSSS ------------------------------ P [PENDING] [build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline] /tmp/tmp.MgeSCHnNQK/tests/build/build_templates.go:489 ------------------------------ SSSSSSSSSS ------------------------------ P [PENDING] [build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build-oci-ta should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline] /tmp/tmp.MgeSCHnNQK/tests/build/build_templates.go:489 ------------------------------ SSSSS•••••••••••••••••••••••••••••••••••••••••••••••• ------------------------------ P [PENDING] [build-service-suite Build service E2E tests] test build secret lookup when two secrets are created when second component is deleted, pac pr branch should not exist in the repo [build-service, 
pac-build, secret-lookup]
/tmp/tmp.MgeSCHnNQK/tests/build/build.go:1118
------------------------------
••••••••••••
------------------------------
• [FAILED] [370.495 seconds]
[release-pipelines-suite [HACBS-1571]test-release-e2e-push-image-to-pyxis] Post-release verification [It] verifies a release PipelineRun is started and succeeded in managed namespace [release-pipelines, rh-push-to-external-registry]
/tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rh_push_to_external_registry.go:226
Timeline >>
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun has not been created yet for release %s/%s push-pyxis-zrpy snapshot-sample-iqme-8gzfr
PipelineRun managed-bsk9b reason: ResolvingPipelineRef
PipelineRun managed-bsk9b reason: ResolvingTaskRef
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Running
PipelineRun managed-bsk9b reason: Failed
[FAILED] in [It] - /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rh_push_to_external_registry.go:230 @ 10/24/25 13:25:57.033
<< Timeline

[FAILED] Error when waiting for a release pipelinerun for release push-pyxis-zrpy/snapshot-sample-iqme-8gzfr to finish
Expected success, but got an error:
    <*errors.errorString | 0xc000436870>:
    Pipelinerun 'managed-bsk9b' didn't succeed
    Logs from failed container 'managed-bsk9b-push-rpm-data-to-pyxis/step-create-trusted-artifact':
    2025/10/24 13:25:40 Skipping step because a previous step failed
    {
        s: "Pipelinerun 'managed-bsk9b' didn't succeed\nLogs from failed container 'managed-bsk9b-push-rpm-data-to-pyxis/step-create-trusted-artifact': \n2025/10/24 13:25:40 Skipping step because a previous step failed\n",
    }
In [It] at: /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rh_push_to_external_registry.go:230 @ 10/24/25 13:25:57.033
------------------------------
SSS••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••

Summarizing 8 Failures:
  [PANICKED!] [upgrade-suite Create users and check their state] [It] Verify AppStudioProvisionedUser [upgrade-verify]
  /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56
  [FAIL] [release-pipelines-suite e2e tests for rh-push-to-redhat-io pipeline] Rh-push-to-redhat-io happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rh-push-to-registry-redhat-io, PushToRedhatIO]
  /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323
  [FAIL] [release-pipelines-suite FBC e2e-tests] with FBC happy path [BeforeAll] Post-release verification creates component from git source https://github.com/redhat-appstudio-qe/fbc-sample-repo-test [release-pipelines, fbc-release, fbcHappyPath]
  /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323
  [FAIL] [build-service-suite Build templates E2E test] HACBS pipelines [BeforeAll] triggers PipelineRun for symlink component with source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic with component name test-symlink-comp-czct [build, build-templates, HACBS, pipeline-service, pipeline, build-templates-e2e, source-build-e2e]
  /tmp/tmp.MgeSCHnNQK/tests/build/build_templates.go:261
  [FAIL] [release-pipelines-suite [HACBS-1571]test-release-e2e-push-image-to-pyxis] Post-release verification [It] verifies a release PipelineRun is started and succeeded in managed namespace [release-pipelines, rh-push-to-external-registry]
  /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rh_push_to_external_registry.go:230
  [FAIL] [release-pipelines-suite e2e tests for rh-advisories pipeline] Rh-advisories happy path [BeforeAll] Post-release verification verifies if release CR is created [release-pipelines, rh-advisories, rhAdvisories]
  /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323
  [FAIL] [release-pipelines-suite e2e tests for rhtap-service-push pipeline] Rhtap-service-push happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rhtap-service-push, RhtapServicePush]
  /tmp/tmp.MgeSCHnNQK/tests/release/pipelines/rhtap_service_push.go:119
  [FAIL] [release-pipelines-suite e2e tests for multi arch with rh-advisories pipeline] Multi arch test happy path [BeforeAll] Post-release verification verifies the release CR is created [release-pipelines, rh-advisories, multiarch-advisories, multiArchAdvisories]
  /tmp/tmp.MgeSCHnNQK/tests/release/releaseLib.go:323

Ran 300 of 389 Specs in 1960.174 seconds
FAIL! -- 292 Passed | 8 Failed | 34 Pending | 55 Skipped

Ginkgo ran 1 suite in 34m11.758335121s
Test Suite Failed
Error: running "ginkgo --seed=1761310564 --timeout=1h30m0s --grace-period=30s --output-interceptor-mode=none --no-color --json-report=e2e-report.json --junit-report=e2e-report.xml --procs=20 --nodes=20 --p --output-dir=/workspace/artifact-dir ./cmd --" failed with exit code 1
make: *** [Makefile:25: ci/test/e2e] Error 1