./mage -v ci:teste2e
Running target: CI:TestE2E
I1022 22:59:45.710889 16655 magefile.go:521] setting up new custom bundle for testing...
I1022 22:59:46.322069 16655 util.go:521] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761173986-vqoz -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: docker-build to image
I1022 22:59:47.744466 16655 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761173986-vqoz: quay.io/redhat-appstudio-qe/test-images@sha256:a7761885266f4b5df98b4c8e63903c39b18c7e2b6ecf1e76e4cf3d2bc5efec51
I1022 22:59:47.744503 16655 magefile.go:527] To use the custom docker bundle locally, run below cmd:
export CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE=quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761173986-vqoz
I1022 22:59:47.744533 16655 e2e_repo.go:347] checking if repository is e2e-tests
I1022 22:59:47.744541 16655 e2e_repo.go:335] multi-platform tests and require sprayproxy registering are set to TRUE
exec: git "diff" "--name-status" "upstream/main..HEAD"
I1022 22:59:47.747647 16655 util.go:460] The following files, pkg/clients/has/components.go, pkg/clients/tekton/pipelineruns.go, pkg/utils/tekton/pipelineruns.go, tests/build/build_templates.go, were changed!
exec: go "install" "-mod=mod" "github.com/onsi/ginkgo/v2/ginkgo"
go: downloading github.com/go-task/slim-sprig/v3 v3.0.0
go: downloading github.com/google/pprof v0.0.0-20241210010833-40e02aabc2ad
I1022 22:59:50.610325 16655 install.go:188] cloning 'https://github.com/redhat-appstudio/infra-deployments' with git ref 'refs/heads/main'
Enumerating objects: 66496, done.
Counting objects: 100% (153/153), done.
Compressing objects: 100% (62/62), done.
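The changed-files check above is what feeds the rules engine that decides which suites to run. A minimal Go sketch of that detection, assuming `upstream` points at konflux-ci/e2e-tests (the `changedFiles` helper is illustrative, not the repository's actual API):

```go
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// changedFiles runs the same command the log shows:
// git diff --name-status upstream/main..HEAD
func changedFiles() ([]string, error) {
	out, err := exec.Command("git", "diff", "--name-status", "upstream/main..HEAD").Output()
	if err != nil {
		return nil, err
	}
	var files []string
	for _, line := range strings.Split(strings.TrimSpace(string(out)), "\n") {
		// each line looks like "M\tpkg/clients/has/components.go"
		parts := strings.Fields(line)
		if len(parts) >= 2 {
			files = append(files, parts[1])
		}
	}
	return files, nil
}

func main() {
	files, err := changedFiles()
	if err != nil {
		panic(err)
	}
	fmt.Printf("The following files were changed: %s\n", strings.Join(files, ", "))
}
```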
Total 66496 (delta 116), reused 91 (delta 91), pack-reused 66343 (from 2)
From https://github.com/redhat-appstudio/infra-deployments
 * branch            main       -> FETCH_HEAD
Already up to date.
Installing the OpenShift GitOps operator subscription:
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
subscription.operators.coreos.com/openshift-gitops-operator created
Waiting for default project (and namespace) to exist: ...................................OK
Waiting for OpenShift GitOps Route: OK
argocd.argoproj.io/openshift-gitops patched
argocd.argoproj.io/openshift-gitops patched
Switch the Route to use re-encryption
argocd.argoproj.io/openshift-gitops patched
Restarting ArgoCD Server
pod "openshift-gitops-server-78868c5878-wd95t" deleted
Allow any authenticated users to be admin on the Argo CD instance
argocd.argoproj.io/openshift-gitops patched
Mark Pending PVC as Healthy, workaround for WaitForFirstConsumer StorageClasses.
Warning: unknown field "spec.resourceCustomizations"
argocd.argoproj.io/openshift-gitops patched (no change)
Setting kustomize build options
argocd.argoproj.io/openshift-gitops patched
Setting ignore Aggregated Roles
argocd.argoproj.io/openshift-gitops patched
Setting ArgoCD tracking method to annotation
argocd.argoproj.io/openshift-gitops patched
Restarting GitOps server
deployment.apps/openshift-gitops-server restarted
=========================================================================
Argo CD URL is: https://openshift-gitops-server-openshift-gitops.apps.rosa.kx-1c65cf446b.ncwm.p3.openshiftapps.com
(NOTE: It may take a few moments for the route to become available)
Waiting for the route: .........OK
Login/password uses your OpenShift credentials ('Login with OpenShift' button)
Setting secrets for Quality Dashboard
namespace/quality-dashboard created
secret/quality-dashboard-secrets created
Creating secret for CI Helper App
namespace/ci-helper-app created
secret/ci-helper-app-secrets created
[WARN] Namespace 'image-controller' does not exist. Creating it...
namespace/image-controller created
secret/sealights-token created
[INFO] Secret 'sealights-token' has been created/updated in namespace 'image-controller'.
[WARN] Namespace 'integration-service' does not exist. Creating it...
namespace/integration-service created
secret/sealights-token created
[INFO] Secret 'sealights-token' has been created/updated in namespace 'integration-service'.
[WARN] Namespace 'release-service' does not exist. Creating it...
namespace/release-service created
secret/sealights-token created
[INFO] Secret 'sealights-token' has been created/updated in namespace 'release-service'.
[WARN] Namespace 'build-service' does not exist. Creating it...
namespace/build-service created
secret/sealights-token created
[INFO] Secret 'sealights-token' has been created/updated in namespace 'build-service'.
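The "Waiting for OpenShift GitOps Route" step is a plain poll until the route reports a host. A Go sketch of that wait, assuming an `oc` session is already logged in (the loop bounds and the route/namespace names, taken from the log, are illustrative):

```go
package main

import (
	"fmt"
	"os/exec"
	"strings"
	"time"
)

func main() {
	for i := 0; i < 60; i++ { // give the route up to ~2 minutes to appear
		out, err := exec.Command("oc", "get", "route", "openshift-gitops-server",
			"-n", "openshift-gitops",
			"-o", "jsonpath={.status.ingress[0].host}").Output()
		if host := strings.TrimSpace(string(out)); err == nil && host != "" {
			fmt.Println("Argo CD URL is: https://" + host)
			return
		}
		time.Sleep(2 * time.Second) // retry until the route reports a host
	}
	fmt.Println("timed out waiting for the OpenShift GitOps route")
}
```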
Setting secrets for pipeline-service
tekton-results namespace already exists, skipping creation
tekton-logging namespace already exists, skipping creation
namespace/product-kubearchive-logging created
Creating DB secret
secret/tekton-results-database created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
secret/minio-storage-configuration created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
MinIO config already exists, skipping creation
Creating Postgres TLS certs
Certificate request self-signature ok
subject=CN=cluster.local
Certificate request self-signature ok
subject=CN=postgres-postgresql.tekton-results.svc.cluster.local
secret/postgresql-tls created
configmap/rds-root-crt created
namespace/application-service created
Creating a has secret from legacy token
secret/has-github-token created
Creating a secret with a token for Image Controller
Warning: resource namespaces/image-controller is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by oc apply. oc apply should only be used on resources created declaratively by either oc create --save-config or oc apply. The missing annotation will be patched automatically.
namespace/image-controller configured
secret/quaytoken created
Configuring the cluster with a pull secret for Docker Hub
Saved credentials for docker.io into /tmp/tmp.L2e0W2725f
secret/pull-secret data updated
Saved credentials for docker.io into /tmp/tmp.L2e0W2725f
secret/docker-io-pull created
serviceaccount/appstudio-pipeline created
Setting secrets for Dora metrics exporter
namespace/dora-metrics created
secret/exporters-secret created
Setting Cluster Mode: preview
Switched to a new branch 'preview-main-xogp'
labeling node/ip-10-0-132-47.ec2.internal...
node/ip-10-0-132-47.ec2.internal labeled
successfully labeled node/ip-10-0-132-47.ec2.internal
labeling node/ip-10-0-146-76.ec2.internal...
node/ip-10-0-146-76.ec2.internal labeled
successfully labeled node/ip-10-0-146-76.ec2.internal
labeling node/ip-10-0-164-225.ec2.internal...
node/ip-10-0-164-225.ec2.internal labeled
successfully labeled node/ip-10-0-164-225.ec2.internal
verifying labels...
all nodes labeled successfully.
Detected OCP minor version: 17
Changing AppStudio Gitlab Org to "redhat-appstudio-qe"
[preview-main-xogp 777dad440] Preview mode, do not merge into main
 6 files changed, 12 insertions(+), 18 deletions(-)
remote:
remote: Create a pull request for 'preview-main-xogp' on GitHub by visiting:
remote:      https://github.com/redhat-appstudio-qe/infra-deployments/pull/new/preview-main-xogp
remote:
To https://github.com/redhat-appstudio-qe/infra-deployments.git
 * [new branch]      preview-main-xogp -> preview-main-xogp
branch 'preview-main-xogp' set up to track 'qe/preview-main-xogp'.
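The repeated "Secret 'sealights-token' has been created/updated" messages above follow the usual create-or-update pattern. A minimal client-go sketch of that pattern (kubeconfig location, namespace, and the token env var are assumptions; the real setup scripts shell out to `oc`/`kubectl`):

```go
package main

import (
	"context"
	"os"

	corev1 "k8s.io/api/core/v1"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// upsertSealightsToken creates the secret, falling back to an update when it
// already exists -- i.e. "has been created/updated".
func upsertSealightsToken(ctx context.Context, ns, token string) error {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		return err
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		return err
	}
	secret := &corev1.Secret{
		ObjectMeta: metav1.ObjectMeta{Name: "sealights-token", Namespace: ns},
		StringData: map[string]string{"token": token},
	}
	_, err = cs.CoreV1().Secrets(ns).Create(ctx, secret, metav1.CreateOptions{})
	if apierrors.IsAlreadyExists(err) {
		_, err = cs.CoreV1().Secrets(ns).Update(ctx, secret, metav1.UpdateOptions{})
	}
	return err
}

func main() {
	if err := upsertSealightsToken(context.Background(), "build-service", os.Getenv("SEALIGHTS_TOKEN")); err != nil {
		panic(err)
	}
}
```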
application.argoproj.io/all-application-sets created
Waiting for sync of all-application-sets argoCD app
application.argoproj.io/disable-csvcopy-in-cluster-local patched
application.argoproj.io/kyverno-in-cluster-local patched
application.argoproj.io/konflux-kite-in-cluster-local patched
application.argoproj.io/trust-manager-in-cluster-local patched
application.argoproj.io/cert-manager-in-cluster-local patched
application.argoproj.io/enterprise-contract-in-cluster-local patched
application.argoproj.io/crossplane-control-plane-in-cluster-local patched
application.argoproj.io/tracing-workload-otel-collector-in-cluster-local patched
application.argoproj.io/project-controller-in-cluster-local patched
application.argoproj.io/dora-metrics-in-cluster-local patched
application.argoproj.io/build-templates-in-cluster-local patched
application.argoproj.io/internal-services-in-cluster-local patched
application.argoproj.io/image-controller-in-cluster-local patched
application.argoproj.io/integration-in-cluster-local patched
application.argoproj.io/all-application-sets patched
application.argoproj.io/repository-validator-in-cluster-local patched
application.argoproj.io/konflux-rbac-in-cluster-local patched
application.argoproj.io/kueue-in-cluster-local patched
application.argoproj.io/tracing-workload-tracing-in-cluster-local patched
application.argoproj.io/squid-in-cluster-local patched
application.argoproj.io/vector-kubearchive-log-collector-in-cluster-local patched (no change)
application.argoproj.io/application-api-in-cluster-local patched
application.argoproj.io/vector-tekton-logs-collector-in-cluster-local patched
application.argoproj.io/perf-team-prometheus-reader-in-cluster-local patched
application.argoproj.io/tempo-in-cluster-local patched
application.argoproj.io/ingresscontroller-in-cluster-local patched
application.argoproj.io/pipeline-service-in-cluster-local patched
application.argoproj.io/monitoring-workload-prometheus-in-cluster-local patched
application.argoproj.io/has-in-cluster-local patched
application.argoproj.io/kubearchive-in-cluster-local patched
application.argoproj.io/policies-in-cluster-local patched
application.argoproj.io/release-in-cluster-local patched
application.argoproj.io/knative-eventing-in-cluster-local patched
application.argoproj.io/monitoring-workload-grafana-in-cluster-local patched
application.argoproj.io/mintmaker-in-cluster-local patched
application.argoproj.io/build-service-in-cluster-local patched
application.argoproj.io/multi-platform-controller-in-cluster-local patched
application-api-in-cluster-local                    OutOfSync   Missing
build-service-in-cluster-local                      Synced      Progressing
integration-in-cluster-local                        OutOfSync   Missing
internal-services-in-cluster-local                  OutOfSync   Missing
knative-eventing-in-cluster-local                   OutOfSync   Missing
konflux-rbac-in-cluster-local                       OutOfSync   Healthy
kubearchive-in-cluster-local                        OutOfSync   Missing
kueue-in-cluster-local                              OutOfSync   Degraded
kyverno-in-cluster-local                            OutOfSync   Missing
monitoring-workload-grafana-in-cluster-local        OutOfSync   Missing
multi-platform-controller-in-cluster-local          OutOfSync   Missing
pipeline-service-in-cluster-local                   OutOfSync   Missing
policies-in-cluster-local                           OutOfSync   Healthy
release-in-cluster-local                            Synced      Progressing
squid-in-cluster-local                              OutOfSync   Healthy
trust-manager-in-cluster-local                      OutOfSync   Missing
vector-kubearchive-log-collector-in-cluster-local   OutOfSync   Progressing
Waiting 10 seconds for application sync
...
build-service-in-cluster-local                      Synced      Progressing
kubearchive-in-cluster-local                        Synced      Progressing
kueue-in-cluster-local                              OutOfSync   Progressing
monitoring-workload-grafana-in-cluster-local        OutOfSync   Progressing
pipeline-service-in-cluster-local                   OutOfSync   Missing
policies-in-cluster-local                           OutOfSync   Healthy
Waiting 10 seconds for application sync
...
kueue-in-cluster-local                              OutOfSync   Healthy
pipeline-service-in-cluster-local                   OutOfSync   Missing
policies-in-cluster-local                           OutOfSync   Healthy
Waiting 10 seconds for application sync
...
policies-in-cluster-local                           OutOfSync   Healthy
Waiting 10 seconds for application sync
...
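Each round above is one iteration of a simple poll over the Argo CD Applications. The same loop in Go form (a sketch only; the real wait is a shell loop in the infra-deployments preview script):

```go
package main

import (
	"fmt"
	"os/exec"
	"strings"
	"time"
)

func main() {
	for {
		// One "name sync health" line per Application in the openshift-gitops namespace.
		out, _ := exec.Command("kubectl", "get", "applications.argoproj.io",
			"-n", "openshift-gitops",
			"-o", `jsonpath={range .items[*]}{.metadata.name}{" "}{.status.sync.status}{" "}{.status.health.status}{"\n"}{end}`).Output()
		pending := false
		for _, line := range strings.Split(strings.TrimSpace(string(out)), "\n") {
			if line != "" && !strings.Contains(line, "Synced Healthy") {
				fmt.Println(line) // report apps still OutOfSync / not yet Healthy
				pending = true
			}
		}
		if !pending {
			fmt.Println("All Applications are synced and Healthy")
			return
		}
		fmt.Println("Waiting 10 seconds for application sync")
		time.Sleep(10 * time.Second)
	}
}
```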
All Applications are synced and Healthy
All required tekton resources are installed and ready
Tekton CRDs are ready
Setup Pac with existing QE sprayproxy and github App
namespace/openshift-pipelines configured
namespace/build-service configured
namespace/integration-service configured
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
Configured pipelines-as-code-secret secret in openshift-pipelines namespace
Switched to branch 'main'
Your branch is up to date with 'upstream/main'.
[controller-runtime] log.SetLogger(...) was never called; logs will not be displayed. Detected at:
> goroutine 80 [running]:
> runtime/debug.Stack()
> 	/usr/lib/golang/src/runtime/debug/stack.go:26 +0x5e
> sigs.k8s.io/controller-runtime/pkg/log.eventuallyFulfillRoot()
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/log.go:60 +0xcd
> sigs.k8s.io/controller-runtime/pkg/log.(*delegatingLogSink).WithName(0xc000259740, {0x2f94060, 0x14})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/deleg.go:147 +0x3e
> github.com/go-logr/logr.Logger.WithName({{0x36edaf0, 0xc000259740}, 0x0}, {0x2f94060?, 0x0?})
> 	/opt/app-root/src/go/pkg/mod/github.com/go-logr/logr@v1.4.2/logr.go:345 +0x36
> sigs.k8s.io/controller-runtime/pkg/client.newClient(0x2d71be0?, {0x0, 0xc000285730, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:129 +0xf1
> sigs.k8s.io/controller-runtime/pkg/client.New(0xc000a94008?, {0x0, 0xc000285730, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:110 +0x7d
> github.com/konflux-ci/e2e-tests/pkg/clients/kubernetes.NewAdminKubernetesClient()
> 	/tmp/tmp.vlYjbA9DEg/pkg/clients/kubernetes/client.go:157 +0xa5
> github.com/konflux-ci/e2e-tests/pkg/clients/sprayproxy.GetPaCHost()
> 	/tmp/tmp.vlYjbA9DEg/pkg/clients/sprayproxy/sprayproxy.go:93 +0x1c
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.registerPacServer()
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/repos/common.go:426 +0x78
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.init.func8(0xc0009ecd88?)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/repos/common.go:378 +0x25
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.ActionFunc.Execute(0xc?, 0x2f6ead5?)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:279 +0x19
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Apply(...)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:315
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x5246200, 0xc0009ecd88)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:348 +0xb3
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x523ec80?, 0xc001803c00?, 0x1f1bc99?}, 0xc0009ecd88)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:245 +0x4f
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:308
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x52462c0, 0xc0009ecd88)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:340 +0x2b
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x5247f40?, 0x7fb3f8d00108?, 0x70?}, 0xc0009ecd88)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:245 +0x4f
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:308
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).runLoadedCatalog(0x527d970, {0xc00123c508?, 0xc000f09e60?, 0x47?}, 0xc0009ecd88)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:129 +0x119
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).RunRulesOfCategory(0x527d970, {0x2f68e83, 0x2}, 0xc0009ecd88)
> 	/tmp/tmp.vlYjbA9DEg/magefiles/rulesengine/types.go:121 +0x1b4
> main.CI.TestE2E({})
> 	/tmp/tmp.vlYjbA9DEg/magefiles/magefile.go:322 +0x18a
> main.main.func19({0xc00077bce0?, 0x0?})
> 	/tmp/tmp.vlYjbA9DEg/magefiles/mage_output_file.go:827 +0xf
> main.main.func12.1()
> 	/tmp/tmp.vlYjbA9DEg/magefiles/mage_output_file.go:302 +0x5b
> created by main.main.func12 in goroutine 1
> 	/tmp/tmp.vlYjbA9DEg/magefiles/mage_output_file.go:297 +0xbe
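The stack trace above is controller-runtime's well-known warning, not a failure: a client was built before any logger was installed. The standard fix is to set a logger early; a sketch using controller-runtime's zap helper (where exactly e2e-tests should make this call is an assumption):

```go
package main

import (
	ctrllog "sigs.k8s.io/controller-runtime/pkg/log"
	"sigs.k8s.io/controller-runtime/pkg/log/zap"
)

func init() {
	// Install a logger before the first controller-runtime client is created;
	// this silences "log.SetLogger(...) was never called; logs will not be displayed."
	ctrllog.SetLogger(zap.New(zap.UseDevMode(true)))
}

func main() {}
```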
I1022 23:20:49.401856 16655 common.go:434] Registered PaC server: https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-1c65cf446b.ncwm.p3.openshiftapps.com
I1022 23:20:49.470900 16655 common.go:459] The PaC servers registered in Sprayproxy: https://pipelines-as-code-controller-openshift-pipelines.apps.konflux-4-18-us-west-2-dfn7w.konflux-qe.devcluster.openshift.com, https://pipelines-as-code-controller-openshift-pipelines.apps.konflux-4-17-us-west-2-vp6zs.konflux-qe.devcluster.openshift.com, https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-1c65cf446b.ncwm.p3.openshiftapps.com
I1022 23:20:49.470929 16655 common.go:475] going to create new Tekton bundle remote-build for the purpose of testing multi-platform-controller PR
I1022 23:20:49.827822 16655 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:e8648dff9859d89a0dbaedd120eec2e07afbffc850e706d16d0ac479a905efa0
I1022 23:20:49.829617 16655 util.go:521] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175249-kqkl -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I1022 23:20:51.177888 16655 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175249-kqkl: quay.io/redhat-appstudio-qe/test-images@sha256:c28755e9dd8ec751d427cd992cb89c43791e958586c9e9f8880a8a76eef1319c
I1022 23:20:51.177926 16655 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_ARM64 to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175249-kqkl
I1022 23:20:51.458937 16655 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:e8648dff9859d89a0dbaedd120eec2e07afbffc850e706d16d0ac479a905efa0
I1022 23:20:51.460710 16655 util.go:521] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175251-vbzh -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I1022 23:20:52.748856 16655 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175251-vbzh: quay.io/redhat-appstudio-qe/test-images@sha256:908bc49d3d8227eb0294d5d5de6e12ecdd349f4c60a24818625544dadb446f9d
I1022 23:20:52.748887 16655 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_S390X to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175251-vbzh
I1022 23:20:53.146076 16655 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:e8648dff9859d89a0dbaedd120eec2e07afbffc850e706d16d0ac479a905efa0
I1022 23:20:53.147927 16655 util.go:521] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175252-htzl -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I1022 23:20:54.471350 16655 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175252-htzl: quay.io/redhat-appstudio-qe/test-images@sha256:ee25578c68c661a0728098081ab4567c1a2c16bd67b3c100c381fc46d2e434fc
I1022 23:20:54.471385 16655 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_PPC64LE to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1761175252-htzl
exec: ginkgo "--seed=1761173985" "--timeout=1h30m0s" "--grace-period=30s" "--output-interceptor-mode=none" "--label-filter=!upgrade-create && !upgrade-verify && !upgrade-cleanup && !release-pipelines" "--no-color" "--json-report=e2e-report.json" "--junit-report=e2e-report.xml" "--procs=20" "--nodes=20" "--p" "--output-dir=/workspace/artifact-dir" "./cmd" "--"
go: downloading github.com/konflux-ci/build-service v0.0.0-20240611083846-2dee6cfe6fe4
go: downloading github.com/IBM/go-sdk-core/v5 v5.15.3
go: downloading github.com/IBM/vpc-go-sdk v0.48.0
go: downloading github.com/aws/aws-sdk-go-v2 v1.32.7
go: downloading github.com/aws/aws-sdk-go-v2/config v1.28.7
go: downloading github.com/aws/aws-sdk-go-v2/service/ec2 v1.135.0
go: downloading github.com/go-playground/validator/v10 v10.17.0
go: downloading github.com/go-openapi/strfmt v0.22.0
go: downloading github.com/aws/smithy-go v1.22.1
go: downloading github.com/google/go-github/v45 v45.2.0
go: downloading github.com/aws/aws-sdk-go-v2/credentials v1.17.48
go: downloading github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.16.22
go: downloading github.com/aws/aws-sdk-go-v2/internal/ini v1.8.1
go: downloading github.com/aws/aws-sdk-go-v2/service/sso v1.24.8
go: downloading github.com/aws/aws-sdk-go-v2/service/ssooidc v1.28.7
go: downloading github.com/aws/aws-sdk-go-v2/service/sts v1.33.3
go: downloading github.com/mitchellh/mapstructure v1.5.0
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/go-openapi/errors v0.21.0
go: downloading github.com/oklog/ulid v1.3.1
go: downloading go.mongodb.org/mongo-driver v1.13.1
go: downloading github.com/aws/aws-sdk-go-v2/internal/configsources v1.3.26
go: downloading github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.26
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.12.1
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.12.7
go: downloading github.com/gabriel-vasile/mimetype v1.4.3
go: downloading github.com/go-playground/universal-translator v0.18.1
go: downloading github.com/leodido/go-urn v1.3.0
go: downloading github.com/go-playground/locales v0.14.1
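The runner then shells out to ginkgo with the flags shown in the exec line above. A Go sketch of that invocation (flag values copied verbatim from the log; the wrapper itself is illustrative, not the magefile's actual code):

```go
package main

import (
	"os"
	"os/exec"
)

func main() {
	args := []string{
		"--seed=1761173985",
		"--timeout=1h30m0s",
		"--grace-period=30s",
		"--output-interceptor-mode=none",
		"--label-filter=!upgrade-create && !upgrade-verify && !upgrade-cleanup && !release-pipelines",
		"--no-color",
		"--json-report=e2e-report.json",
		"--junit-report=e2e-report.xml",
		"--procs=20", "--nodes=20", "--p",
		"--output-dir=/workspace/artifact-dir",
		"./cmd", "--",
	}
	cmd := exec.Command("ginkgo", args...)
	cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
	if err := cmd.Run(); err != nil {
		os.Exit(1) // propagate the suite's failure to the caller (here: make)
	}
}
```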
Running Suite: Red Hat App Studio E2E tests - /tmp/tmp.vlYjbA9DEg/cmd
=====================================================================
Random Seed: 1761173985
Will run 309 of 387 specs
Running in parallel across 20 processes
SSSS
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if release CR is created [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.vlYjbA9DEg/tests/release/pipelines/release_to_github.go:139
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies the release pipelinerun is running and succeeds [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.vlYjbA9DEg/tests/release/pipelines/release_to_github.go:149
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies release CR completed and set succeeded. [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.vlYjbA9DEg/tests/release/pipelines/release_to_github.go:182
------------------------------
SSSS
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if the Release exists in github repo [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.vlYjbA9DEg/tests/release/pipelines/release_to_github.go:193
------------------------------
SSSSSSSSSSSSSSSSSSSSSSSSSSSSS
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-host-pool]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:120
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-host-pool]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:124
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-host-pool]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:127
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-host-pool]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:148
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created test that cleanup happened successfully [multi-platform, aws-host-pool]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:152
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:251
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:255
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:259
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:263
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, aws-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:267
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmz-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:341
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmz-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:345
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmz-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:349
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmz-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:353
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmz-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:357
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmp-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:432
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmp-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:436
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmp-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:440
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmp-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:444
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmp-dynamic]
/tmp/tmp.vlYjbA9DEg/tests/build/multi-platform.go:448
------------------------------
SSSS
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params when context points to a file [build-templates]
/tmp/tmp.vlYjbA9DEg/tests/build/tkn-bundle.go:177
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles from specific context [build-templates]
/tmp/tmp.vlYjbA9DEg/tests/build/tkn-bundle.go:188
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params when context is the root directory [build-templates]
/tmp/tmp.vlYjbA9DEg/tests/build/tkn-bundle.go:198
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when context points to a file and a directory [build-templates]
/tmp/tmp.vlYjbA9DEg/tests/build/tkn-bundle.go:207
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when using negation [build-templates]
/tmp/tmp.vlYjbA9DEg/tests/build/tkn-bundle.go:217
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params allows overriding HOME environment variable [build-templates]
/tmp/tmp.vlYjbA9DEg/tests/build/tkn-bundle.go:227
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params allows overriding STEP image [build-templates]
/tmp/tmp.vlYjbA9DEg/tests/build/tkn-bundle.go:236
------------------------------
SSS••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••
------------------------------
P [PENDING] [build-service-suite Build service E2E tests] test build secret lookup when two secrets are created when second component is deleted, pac pr branch should not exist in the repo [build-service, pac-build, secret-lookup]
/tmp/tmp.vlYjbA9DEg/tests/build/build.go:1118
------------------------------
•••••••••••••••••••S•••S••
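The `P [PENDING]` entries above and below come from Ginkgo's pending mechanism: specs written with `PIt`/`PDescribe`, or carrying the `Pending` decorator, are compiled and reported but never executed. A minimal sketch with illustrative spec names (suite bootstrap omitted):

```go
package multiplatform_test

import (
	. "github.com/onsi/ginkgo/v2"
)

var _ = Describe("Multi Platform Controller E2E tests", func() {
	// PIt marks a spec pending: it shows up in the report but its body never runs.
	PIt("a PipelineRun is triggered", func() {
		// skipped entirely while the spec is pending
	})

	// Equivalent decorator form.
	It("The multi platform secret is populated", Pending, func() {})
})
```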
------------------------------
P [PENDING] [build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline]
/tmp/tmp.vlYjbA9DEg/tests/build/build_templates.go:499
------------------------------
••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••S•S•
------------------------------
P [PENDING] [build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build-oci-ta should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline]
/tmp/tmp.vlYjbA9DEg/tests/build/build_templates.go:499
------------------------------
••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••
------------------------------
• [FAILED] [168.467 seconds]
[build-service-suite Build service E2E tests] test of component update with renovate gitlab when components are created in same namespace [It] the PipelineRun should eventually finish successfully for parent component [build-service, renovate, multi-component]
/tmp/tmp.vlYjbA9DEg/tests/build/build.go:1469

  Timeline >>
  PipelineRun gl-multi-component-parent-rrtj-on-pull-request-d4xk9 reason: ResolvingTaskRef
  PipelineRun gl-multi-component-parent-rrtj-on-pull-request-d4xk9 reason: Cancelled
  attempt 1/3: PipelineRun "gl-multi-component-parent-rrtj-on-pull-request-d4xk9" failed: PipelineRun build-e2e-szfb/gl-multi-component-parent-rrtj-on-pull-request-d4xk9 is completed, skipping finalizer removal for retry
  PipelineRun has not been created yet for the Component build-e2e-szfb/gl-multi-component-parent-rrtj
  PipelineRun gl-multi-component-parent-rrtj-on-pull-request-wj9b4 reason: ResolvingTaskRef
  PipelineRun gl-multi-component-parent-rrtj-on-pull-request-wj9b4 reason: Running
  ...
  PipelineRun gl-multi-component-parent-rrtj-on-pull-request-wj9b4 reason: Completed
  [FAILED] in [It] - /tmp/tmp.vlYjbA9DEg/tests/build/build.go:1478 @ 10/22/25 23:38:31.686
  << Timeline

  [FAILED] pipelinerun status results: []
  Expected
      : not to be empty
  In [It] at: /tmp/tmp.vlYjbA9DEg/tests/build/build.go:1478 @ 10/22/25 23:38:31.686
------------------------------
SSSSSSSSSSS•••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••

Summarizing 1 Failure:
  [FAIL] [build-service-suite Build service E2E tests] test of component update with renovate gitlab when components are created in same namespace [It] the PipelineRun should eventually finish successfully for parent component [build-service, renovate, multi-component]
  /tmp/tmp.vlYjbA9DEg/tests/build/build.go:1478

Ran 294 of 387 Specs in 1898.187 seconds
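The lone failure reduces to a single Gomega assertion: the retried PipelineRun completed, but the status results the test read back were empty. A stripped-down reproduction of that assertion shape (the stubbed empty slice stands in for whatever the status-results helper in pkg/utils/tekton actually returned):

```go
package build_test

import (
	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
)

var _ = It("the PipelineRun should eventually finish successfully for parent component", func() {
	// Stubbed: in the real test this slice is read from the PipelineRun status.
	results := []string{}
	// Fails exactly like the log: "pipelinerun status results: [] ... not to be empty".
	Expect(results).ToNot(BeEmpty(), "pipelinerun status results: %v", results)
})
```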
FAIL! -- 293 Passed | 1 Failed | 34 Pending | 59 Skipped

Ginkgo ran 1 suite in 33m11.749172741s

Test Suite Failed
Error: running "ginkgo --seed=1761173985 --timeout=1h30m0s --grace-period=30s --output-interceptor-mode=none --label-filter=!upgrade-create && !upgrade-verify && !upgrade-cleanup && !release-pipelines --no-color --json-report=e2e-report.json --junit-report=e2e-report.xml --procs=20 --nodes=20 --p --output-dir=/workspace/artifact-dir ./cmd --" failed with exit code 1
make: *** [Makefile:25: ci/test/e2e] Error 1
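To iterate on the single failure without repeating the 33-minute run, the spec can be re-focused locally. A sketch (--focus is a standard Ginkgo flag; the substring is taken from the failure summary, and cluster credentials/env vars from the setup above are still assumed):

```go
package main

import (
	"os"
	"os/exec"
)

func main() {
	// Run only specs whose full description matches the failed one.
	cmd := exec.Command("ginkgo",
		"--focus=the PipelineRun should eventually finish successfully for parent component",
		"./cmd", "--")
	cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
	if err := cmd.Run(); err != nil {
		os.Exit(1)
	}
}
```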