./mage -v ci:teste2e
go: downloading golang.org/x/sync v0.19.0
go: downloading golang.org/x/tools v0.40.0
go: downloading golang.org/x/term v0.38.0
go: downloading golang.org/x/crypto v0.46.0
go: downloading golang.org/x/exp v0.0.0-20251209150349-8475f28825e9
go: downloading golang.org/x/net v0.48.0
go: downloading golang.org/x/sys v0.39.0
go: downloading golang.org/x/text v0.32.0
Running target: CI:TestE2E
I1219 17:26:49.036994 23276 magefile.go:529] setting up new custom bundle for testing...
I1219 17:26:49.925085 23276 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766165209-yitf -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
   - Added Pipeline: docker-build to image
I1219 17:26:51.964993 23276 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766165209-yitf: quay.io/redhat-appstudio-qe/test-images@sha256:3b99e74d5ad854e04ee66ad4239d15fdc8a27057278dc4f5fb85ac29cf7180f8
I1219 17:26:51.965018 23276 magefile.go:535] To use the custom docker bundle locally, run below cmd:
export CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE=quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766165209-yitf
I1219 17:26:51.965040 23276 build_service.go:49] checking if repository is build-service
I1219 17:26:51.965046 23276 e2e_repo.go:347] checking if repository is e2e-tests
I1219 17:26:51.965050 23276 e2e_repo.go:335] multi-platform tests and require sprayproxy registering are set to TRUE
exec: git "diff" "--name-status" "upstream/main..HEAD"
I1219 17:26:51.967832 23276 util.go:451] The following files, go.mod, go.sum, were changed!
exec: go "install" "-mod=mod" "github.com/onsi/ginkgo/v2/ginkgo"
go: downloading github.com/go-task/slim-sprig/v3 v3.0.0
go: downloading github.com/google/pprof v0.0.0-20241210010833-40e02aabc2ad
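The "Creating Tekton Bundle" step above can be approximated by hand with the tkn CLI. A minimal Go sketch that shells out to tkn bundle push; it assumes tkn is installed and you are logged in to quay.io, and the tag and file name are illustrative (the magefile builds the bundle programmatically rather than via this command):

    package main

    import (
        "os"
        "os/exec"
    )

    func main() {
        // Push a local pipeline definition as a Tekton bundle, mirroring
        // the docker-build bundle step in the log above.
        cmd := exec.Command("tkn", "bundle", "push",
            "quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766165209-yitf",
            "-f", "pipeline-docker-build.yaml") // hypothetical local copy of the pipeline
        cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
        if err := cmd.Run(); err != nil {
            panic(err)
        }
    }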
I1219 17:26:55.121534 23276 install.go:188] cloning 'https://github.com/redhat-appstudio/infra-deployments' with git ref 'refs/heads/main'
Enumerating objects: 70893, done.
Counting objects: 100% (7559/7559), done.
Compressing objects: 100% (457/457), done.
Total 70893 (delta 7214), reused 7108 (delta 7102), pack-reused 63334 (from 5)
From https://github.com/redhat-appstudio/infra-deployments
 * branch main -> FETCH_HEAD
Already up to date.
Installing the OpenShift GitOps operator subscription:
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
subscription.operators.coreos.com/openshift-gitops-operator created
Waiting for default project (and namespace) to exist: .................................OK
Waiting for OpenShift GitOps Route: OK
argocd.argoproj.io/openshift-gitops patched
argocd.argoproj.io/openshift-gitops patched
Switch the Route to use re-encryption
argocd.argoproj.io/openshift-gitops patched
Restarting ArgoCD Server
pod "openshift-gitops-server-78868c5878-8wpsf" deleted
Allow any authenticated users to be admin on the Argo CD instance
argocd.argoproj.io/openshift-gitops patched
Mark Pending PVC as Healthy, workaround for WaitForFirstConsumer StorageClasses.
Warning: unknown field "spec.resourceCustomizations"
argocd.argoproj.io/openshift-gitops patched (no change)
Setting kustomize build options
argocd.argoproj.io/openshift-gitops patched
Setting ignore Aggregated Roles
argocd.argoproj.io/openshift-gitops patched
Setting ArgoCD tracking method to annotation
argocd.argoproj.io/openshift-gitops patched
Restarting GitOps server
deployment.apps/openshift-gitops-server restarted
=========================================================================
Argo CD URL is: https://openshift-gitops-server-openshift-gitops.apps.rosa.kx-e4b60de65d.54x8.p3.openshiftapps.com
(NOTE: It may take a few moments for the route to become available)
Waiting for the route: .....OK
Login/password uses your OpenShift credentials ('Login with OpenShift' button)
Setting secrets for Quality Dashboard
namespace/quality-dashboard created
secret/quality-dashboard-secrets created
Creating secret for CI Helper App
namespace/ci-helper-app created
secret/ci-helper-app-secrets created
Setting secrets for pipeline-service
tekton-results namespace already exists, skipping creation
tekton-logging namespace already exists, skipping creation
namespace/product-kubearchive-logging created
Creating DB secret
secret/tekton-results-database created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
secret/minio-storage-configuration created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
MinIO config already exists, skipping creation
Creating Postgres TLS certs
Certificate request self-signature ok
subject=CN=cluster.local
Certificate request self-signature ok
subject=CN=postgres-postgresql.tekton-results.svc.cluster.local
secret/postgresql-tls created
configmap/rds-root-crt created
namespace/application-service created
Creating a has secret from legacy token
secret/has-github-token created
Creating a secret with a token for Image Controller
namespace/image-controller created
secret/quaytoken created
Configuring the cluster with a pull secret for Docker Hub
Saved credentials for docker.io into /tmp/tmp.FYTH29FV34
secret/pull-secret data updated
Saved credentials for docker.io into /tmp/tmp.FYTH29FV34
secret/docker-io-pull created
Setting secrets for Dora metrics exporter
namespace/dora-metrics created
secret/exporters-secret created
Setting Cluster Mode: preview
Switched to a new branch 'preview-main-ktjo'
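Next the script labels the worker nodes used by the multi-platform tests (output below). A minimal client-go sketch of that kind of node-labeling step; the label key is a placeholder, not the script's actual key:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/types"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        nodes, err := cs.CoreV1().Nodes().List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        // Merge-patch a label onto every node; the key below is illustrative.
        patch := []byte(`{"metadata":{"labels":{"konflux-ci.dev/multi-platform":"true"}}}`)
        for _, n := range nodes.Items {
            fmt.Printf("labeling node/%s...\n", n.Name)
            if _, err := cs.CoreV1().Nodes().Patch(context.TODO(), n.Name,
                types.MergePatchType, patch, metav1.PatchOptions{}); err != nil {
                panic(err)
            }
            fmt.Printf("successfully labeled node/%s\n", n.Name)
        }
    }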
labeling node/ip-10-0-137-98.ec2.internal...
node/ip-10-0-137-98.ec2.internal labeled
successfully labeled node/ip-10-0-137-98.ec2.internal
labeling node/ip-10-0-150-176.ec2.internal...
node/ip-10-0-150-176.ec2.internal labeled
successfully labeled node/ip-10-0-150-176.ec2.internal
labeling node/ip-10-0-161-116.ec2.internal...
node/ip-10-0-161-116.ec2.internal labeled
successfully labeled node/ip-10-0-161-116.ec2.internal
verifying labels... all nodes labeled successfully.
Detected OCP minor version: 17
Changing AppStudio Gitlab Org to "redhat-appstudio-qe"
[preview-main-ktjo 64230c854] Preview mode, do not merge into main
 6 files changed, 12 insertions(+), 18 deletions(-)
remote:
remote: Create a pull request for 'preview-main-ktjo' on GitHub by visiting:
remote:      https://github.com/redhat-appstudio-qe/infra-deployments/pull/new/preview-main-ktjo
remote:
To https://github.com/redhat-appstudio-qe/infra-deployments.git
 * [new branch] preview-main-ktjo -> preview-main-ktjo
branch 'preview-main-ktjo' set up to track 'qe/preview-main-ktjo'.
application.argoproj.io/all-application-sets created
Waiting for sync of all-application-sets argoCD app
Waiting for sync of all-application-sets argoCD app
Waiting for sync of all-application-sets argoCD app
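Each Application below is then patched for preview mode and polled until everything reports Synced and Healthy; the NAME/SYNC/HEALTH tables that follow, printed every 10 seconds, are that loop's output. A minimal Go sketch of such a poll loop, shelling out to oc (illustrative only, not the deploy script's actual helper):

    package main

    import (
        "fmt"
        "os/exec"
        "strings"
        "time"
    )

    // Poll ArgoCD Applications until none are OutOfSync or un-Healthy,
    // roughly reproducing the "Waiting 10 seconds for application sync" loop.
    func main() {
        for {
            out, err := exec.Command("oc", "get", "applications", "-n", "openshift-gitops",
                "--no-headers", "-o",
                "custom-columns=NAME:.metadata.name,SYNC:.status.sync.status,HEALTH:.status.health.status").Output()
            if err != nil {
                panic(err)
            }
            var pending []string
            for _, line := range strings.Split(strings.TrimSpace(string(out)), "\n") {
                f := strings.Fields(line)
                if len(f) == 3 && (f[1] != "Synced" || f[2] != "Healthy") {
                    pending = append(pending, line)
                }
            }
            if len(pending) == 0 {
                fmt.Println("All Applications are synced and Healthy")
                return
            }
            fmt.Println(strings.Join(pending, "\n"))
            fmt.Println("Waiting 10 seconds for application sync")
            time.Sleep(10 * time.Second)
        }
    }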
application.argoproj.io/has-in-cluster-local patched
application.argoproj.io/internal-services-in-cluster-local patched
application.argoproj.io/disable-csvcopy-in-cluster-local patched
application.argoproj.io/integration-in-cluster-local patched
application.argoproj.io/tracing-workload-otel-collector-in-cluster-local patched
application.argoproj.io/kueue-in-cluster-local patched
application.argoproj.io/vector-tekton-logs-collector-in-cluster-local patched
application.argoproj.io/enterprise-contract-in-cluster-local patched
application.argoproj.io/repository-validator-in-cluster-local patched
application.argoproj.io/all-application-sets patched
application.argoproj.io/image-controller-in-cluster-local patched
application.argoproj.io/knative-eventing-in-cluster-local patched
application.argoproj.io/application-api-in-cluster-local patched
application.argoproj.io/trust-manager-in-cluster-local patched
application.argoproj.io/mintmaker-in-cluster-local patched
application.argoproj.io/policies-in-cluster-local patched
application.argoproj.io/kyverno-in-cluster-local patched
application.argoproj.io/konflux-rbac-in-cluster-local patched
application.argoproj.io/build-templates-in-cluster-local patched
application.argoproj.io/dora-metrics-in-cluster-local patched
application.argoproj.io/cert-manager-in-cluster-local patched
application.argoproj.io/perf-team-prometheus-reader-in-cluster-local patched
application.argoproj.io/multi-platform-controller-in-cluster-local patched
application.argoproj.io/konflux-kite-in-cluster-local patched
application.argoproj.io/release-in-cluster-local patched
application.argoproj.io/crossplane-control-plane-in-cluster-local patched
application.argoproj.io/project-controller-in-cluster-local patched
application.argoproj.io/image-rbac-proxy-in-cluster-local patched
application.argoproj.io/squid-in-cluster-local patched
application.argoproj.io/tempo-in-cluster-local patched
application.argoproj.io/monitoring-registry-in-cluster-local patched
application.argoproj.io/vector-kubearchive-log-collector-in-cluster-local patched (no change)
application.argoproj.io/monitoring-workload-prometheus-in-cluster-local patched
application.argoproj.io/build-service-in-cluster-local patched
application.argoproj.io/kubearchive-in-cluster-local patched
application.argoproj.io/pipeline-service-in-cluster-local patched
application.argoproj.io/monitoring-workload-grafana-in-cluster-local patched
application.argoproj.io/tracing-workload-tracing-in-cluster-local patched
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-rbac-proxy-in-cluster-local  Synced  Progressing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Healthy
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-registry-in-cluster-local  OutOfSync  Healthy
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
monitoring-workload-prometheus-in-cluster-local  OutOfSync  Healthy
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Degraded
policies-in-cluster-local  OutOfSync  Healthy
postgres  Synced  Progressing
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-rbac-proxy-in-cluster-local  Synced  Progressing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Healthy
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-registry-in-cluster-local  OutOfSync  Healthy
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
monitoring-workload-prometheus-in-cluster-local  OutOfSync  Healthy
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Degraded
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-rbac-proxy-in-cluster-local  Synced  Progressing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Healthy
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-registry-in-cluster-local  OutOfSync  Healthy
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
monitoring-workload-prometheus-in-cluster-local  OutOfSync  Healthy
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
enterprise-contract-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-registry-in-cluster-local  OutOfSync  Healthy
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
monitoring-workload-prometheus-in-cluster-local  OutOfSync  Healthy
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-registry-in-cluster-local  OutOfSync  Healthy
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
monitoring-workload-prometheus-in-cluster-local  OutOfSync  Healthy
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-registry-in-cluster-local  OutOfSync  Healthy
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  Synced  Progressing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Progressing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Progressing
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Healthy
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Progressing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Progressing
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Healthy
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Progressing
Waiting 10 seconds for application sync
application-api-in-cluster-local  OutOfSync  Missing
build-service-in-cluster-local  Synced  Progressing
cert-manager-in-cluster-local  Synced  Degraded
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
integration-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Healthy
kyverno-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Healthy
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  Unknown  Healthy
vector-kubearchive-log-collector-in-cluster-local failed with: [{"lastTransitionTime":"2025-12-19T17:32:48Z","message":"Resource /ServiceAccount/product-kubearchive-logging/minio-sa appeared 2 times among application resources.","type":"RepeatedResourceWarning"}]
Switched to branch 'main'
Your branch is up to date with 'upstream/main'.
I1219 17:32:49.721669 23276 common.go:283] got an error: exit status 1 - will retry in 10s
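The installer retries the whole install step; the stack trace below comes from the next attempt's client setup. Judging by repos.retry(0x32a5c38, 0x2, 0x2540be400) in that trace, the helper takes a function, an attempt count of 2, and a 10 s delay (0x2540be400 ns). A minimal Go sketch of a retry helper with that shape (assumed; the repo's actual signature may differ):

    package main

    import (
        "fmt"
        "time"
    )

    // retry runs f up to attempts times, sleeping delay between failures --
    // the shape suggested by repos.retry(fn, 0x2, 0x2540be400 /* 10s */).
    func retry(f func() error, attempts int, delay time.Duration) error {
        var err error
        for i := 0; i < attempts; i++ {
            if err = f(); err == nil {
                return nil
            }
            fmt.Printf("got an error: %v - will retry in %s\n", err, delay)
            time.Sleep(delay)
        }
        return fmt.Errorf("all %d attempts failed, last error: %w", attempts, err)
    }

    func main() {
        attempt := 0
        err := retry(func() error {
            attempt++
            if attempt < 2 {
                return fmt.Errorf("exit status 1")
            }
            return nil
        }, 2, 10*time.Second)
        fmt.Println("final result:", err)
    }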
[controller-runtime] log.SetLogger(...) was never called; logs will not be displayed. Detected at:
> goroutine 90 [running]:
> runtime/debug.Stack()
> 	/usr/lib/golang/src/runtime/debug/stack.go:26 +0x5e
> sigs.k8s.io/controller-runtime/pkg/log.eventuallyFulfillRoot()
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/log.go:60 +0xcd
> sigs.k8s.io/controller-runtime/pkg/log.(*delegatingLogSink).WithName(0xc00013d680, {0x2f9c4d6, 0x14})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/deleg.go:147 +0x3e
> github.com/go-logr/logr.Logger.WithName({{0x36f7f70, 0xc00013d680}, 0x0}, {0x2f9c4d6?, 0x0?})
> 	/opt/app-root/src/go/pkg/mod/github.com/go-logr/logr@v1.4.2/logr.go:345 +0x36
> sigs.k8s.io/controller-runtime/pkg/client.newClient(0xffffffffffffffff?, {0x0, 0xc0005233b0, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:129 +0xf1
> sigs.k8s.io/controller-runtime/pkg/client.New(0xc000287208?, {0x0, 0xc0005233b0, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:110 +0x7d
> github.com/konflux-ci/e2e-tests/pkg/clients/kubernetes.NewAdminKubernetesClient()
> 	/tmp/tmp.lbMmitGtEq/pkg/clients/kubernetes/client.go:157 +0xa5
> github.com/konflux-ci/e2e-tests/magefiles/installation.NewAppStudioInstallController()
> 	/tmp/tmp.lbMmitGtEq/magefiles/installation/install.go:98 +0x31
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.InstallKonflux()
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/repos/common.go:267 +0x13
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.retry(0x32a5c38, 0x2, 0x2540be400)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/repos/common.go:286 +0xf9
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.init.func7(0x36c0b80?)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/repos/common.go:360 +0xae
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.ActionFunc.Execute(0xc?, 0x2f76eee?)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:279 +0x19
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Apply(...)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:315
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x5254a80, 0xc00025e008)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:348 +0xb3
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x524d560?, 0xc00119bc00?, 0x1f1a179?}, 0xc00025e008)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:245 +0x4f
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:308
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x5254ba0, 0xc00025e008)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:340 +0x2b
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x5256820?, 0x4295dc?, 0x52d9c80?}, 0xc00025e008)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:245 +0x4f
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:308
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).runLoadedCatalog(0x528c3f0, {0xc00016a008?, 0xc001277e60?, 0x47?}, 0xc00025e008)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:129 +0x119
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).RunRulesOfCategory(0x528c3f0, {0x2f71243, 0x2}, 0xc00025e008)
> 	/tmp/tmp.lbMmitGtEq/magefiles/rulesengine/types.go:121 +0x1b4
> main.CI.TestE2E({})
> 	/tmp/tmp.lbMmitGtEq/magefiles/magefile.go:330 +0x18a
> main.main.func19({0x0?, 0x0?})
> 	/tmp/tmp.lbMmitGtEq/magefiles/mage_output_file.go:827 +0xf
> main.main.func12.1()
> 	/tmp/tmp.lbMmitGtEq/magefiles/mage_output_file.go:302 +0x5b
> created by main.main.func12 in goroutine 1
> 	/tmp/tmp.lbMmitGtEq/magefiles/mage_output_file.go:297 +0xbe
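The [controller-runtime] warning above is harmless for this run but easy to silence: controller-runtime expects a logger to be installed before the first client is created. SetLogger and the zap constructor are the real controller-runtime API; where to wire this into the magefiles (e.g. near NewAdminKubernetesClient) is an assumption:

    package main

    import (
        ctrllog "sigs.k8s.io/controller-runtime/pkg/log"
        "sigs.k8s.io/controller-runtime/pkg/log/zap"
    )

    func init() {
        // Install a logger before any controller-runtime client is built;
        // this silences "log.SetLogger(...) was never called".
        ctrllog.SetLogger(zap.New())
    }

    func main() {}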
W1219 17:32:59.723207 23276 install.go:178] folder /tmp/tmp.lbMmitGtEq/tmp/infra-deployments already exists... removing
I1219 17:32:59.831885 23276 install.go:188] cloning 'https://github.com/redhat-appstudio/infra-deployments' with git ref 'refs/heads/main'
Enumerating objects: 70893, done.
Counting objects: 100% (7901/7901), done.
Compressing objects: 100% (470/470), done.
Total 70893 (delta 7547), reused 7437 (delta 7431), pack-reused 62992 (from 2)
From https://github.com/redhat-appstudio/infra-deployments
 * branch main -> FETCH_HEAD
Already up to date.
Installing the OpenShift GitOps operator subscription:
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller unchanged
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server unchanged
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller unchanged
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server unchanged
subscription.operators.coreos.com/openshift-gitops-operator unchanged
Waiting for default project (and namespace) to exist: OK
Waiting for OpenShift GitOps Route: OK
argocd.argoproj.io/openshift-gitops patched (no change)
argocd.argoproj.io/openshift-gitops patched (no change)
Switch the Route to use re-encryption
argocd.argoproj.io/openshift-gitops patched (no change)
Restarting ArgoCD Server
pod "openshift-gitops-server-867475867d-wfl8x" deleted
Allow any authenticated users to be admin on the Argo CD instance
argocd.argoproj.io/openshift-gitops patched (no change)
Mark Pending PVC as Healthy, workaround for WaitForFirstConsumer StorageClasses.
Warning: unknown field "spec.resourceCustomizations"
argocd.argoproj.io/openshift-gitops patched (no change)
Setting kustomize build options
argocd.argoproj.io/openshift-gitops patched (no change)
Setting ignore Aggregated Roles
argocd.argoproj.io/openshift-gitops patched (no change)
Setting ArgoCD tracking method to annotation
argocd.argoproj.io/openshift-gitops patched (no change)
Restarting GitOps server
deployment.apps/openshift-gitops-server restarted
=========================================================================
Argo CD URL is: https://openshift-gitops-server-openshift-gitops.apps.rosa.kx-e4b60de65d.54x8.p3.openshiftapps.com
(NOTE: It may take a few moments for the route to become available)
Waiting for the route: ...OK
Login/password uses your OpenShift credentials ('Login with OpenShift' button)
Setting secrets for Quality Dashboard
namespace/quality-dashboard configured
Creating secret for CI Helper App
namespace/ci-helper-app configured
Setting secrets for pipeline-service
tekton-results namespace already exists, skipping creation
tekton-logging namespace already exists, skipping creation
product-kubearchive-logging namespace already exists, skipping creation
Creating DB secret
DB secret already exists, skipping creation
Creating S3 secret
S3 secret already exists, skipping creation
Creating S3 secret
S3 secret already exists, skipping creation
Creating Postgres TLS certs
Postgres DB cert secret already exists, skipping creation
namespace/application-service configured
Creating a has secret from legacy token
secret/has-github-token configured
Creating a secret with a token for Image Controller
namespace/image-controller configured
secret/quaytoken configured
Configuring the cluster with a pull secret for Docker Hub
Saved credentials for docker.io into /tmp/tmp.AXFYaqXr9i
secret/pull-secret data updated
Saved credentials for docker.io into /tmp/tmp.AXFYaqXr9i
secret/docker-io-pull configured
Setting secrets for Dora metrics exporter
namespace/dora-metrics configured
Setting Cluster Mode: preview
Switched to a new branch 'preview-main-zrvq'
labeling node/ip-10-0-137-98.ec2.internal...
node/ip-10-0-137-98.ec2.internal not labeled
successfully labeled node/ip-10-0-137-98.ec2.internal
labeling node/ip-10-0-150-176.ec2.internal...
node/ip-10-0-150-176.ec2.internal not labeled
successfully labeled node/ip-10-0-150-176.ec2.internal
labeling node/ip-10-0-161-116.ec2.internal...
node/ip-10-0-161-116.ec2.internal not labeled
successfully labeled node/ip-10-0-161-116.ec2.internal
verifying labels... all nodes labeled successfully.
Detected OCP minor version: 17
Changing AppStudio Gitlab Org to "redhat-appstudio-qe"
[preview-main-zrvq 51bb8263f] Preview mode, do not merge into main
 6 files changed, 12 insertions(+), 18 deletions(-)
remote:
remote: Create a pull request for 'preview-main-zrvq' on GitHub by visiting:
remote:      https://github.com/redhat-appstudio-qe/infra-deployments/pull/new/preview-main-zrvq
remote:
To https://github.com/redhat-appstudio-qe/infra-deployments.git
 * [new branch] preview-main-zrvq -> preview-main-zrvq
branch 'preview-main-zrvq' set up to track 'qe/preview-main-zrvq'.
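Note that this second pass is effectively idempotent: everything reports "unchanged", "(no change)", or "already exists, skipping creation". A minimal client-go sketch of the create-if-absent pattern behind that kind of output (namespace name taken from the log; error handling simplified):

    package main

    import (
        "context"
        "fmt"

        corev1 "k8s.io/api/core/v1"
        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        ns := &corev1.Namespace{ObjectMeta: metav1.ObjectMeta{Name: "tekton-results"}}
        _, err = cs.CoreV1().Namespaces().Create(context.TODO(), ns, metav1.CreateOptions{})
        switch {
        case apierrors.IsAlreadyExists(err):
            // Re-runs fall through here, matching the log's "skipping creation".
            fmt.Println("tekton-results namespace already exists, skipping creation")
        case err != nil:
            panic(err)
        default:
            fmt.Println("namespace/tekton-results created")
        }
    }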
application.argoproj.io/all-application-sets configured
application.argoproj.io/mintmaker-in-cluster-local patched
application.argoproj.io/image-controller-in-cluster-local patched
application.argoproj.io/dora-metrics-in-cluster-local patched
application.argoproj.io/build-service-in-cluster-local patched
application.argoproj.io/image-rbac-proxy-in-cluster-local patched
application.argoproj.io/all-application-sets patched
application.argoproj.io/postgres patched
application.argoproj.io/tempo-in-cluster-local patched
application.argoproj.io/konflux-rbac-in-cluster-local patched
application.argoproj.io/kyverno-in-cluster-local patched
application.argoproj.io/konflux-kite-in-cluster-local patched
application.argoproj.io/application-api-in-cluster-local patched
application.argoproj.io/project-controller-in-cluster-local patched
application.argoproj.io/internal-services-in-cluster-local patched
application.argoproj.io/crossplane-control-plane-in-cluster-local patched
application.argoproj.io/vector-kubearchive-log-collector-in-cluster-local patched (no change)
application.argoproj.io/monitoring-registry-in-cluster-local patched
application.argoproj.io/disable-csvcopy-in-cluster-local patched
application.argoproj.io/monitoring-workload-prometheus-in-cluster-local patched
application.argoproj.io/perf-team-prometheus-reader-in-cluster-local patched
application.argoproj.io/build-templates-in-cluster-local patched
application.argoproj.io/cert-manager-in-cluster-local patched
application.argoproj.io/policies-in-cluster-local patched
application.argoproj.io/repository-validator-in-cluster-local patched
application.argoproj.io/knative-eventing-in-cluster-local patched
application.argoproj.io/tracing-workload-otel-collector-in-cluster-local patched
application.argoproj.io/vector-tekton-logs-collector-in-cluster-local patched
application.argoproj.io/monitoring-workload-grafana-in-cluster-local patched
application.argoproj.io/pipeline-service-in-cluster-local patched
application.argoproj.io/has-in-cluster-local patched
application.argoproj.io/kueue-in-cluster-local patched
application.argoproj.io/squid-in-cluster-local patched
application.argoproj.io/tracing-workload-tracing-in-cluster-local patched
application.argoproj.io/integration-in-cluster-local patched
application.argoproj.io/trust-manager-in-cluster-local patched
application.argoproj.io/multi-platform-controller-in-cluster-local patched
application.argoproj.io/kubearchive-in-cluster-local patched
application.argoproj.io/release-in-cluster-local patched
application.argoproj.io/enterprise-contract-in-cluster-local patched
All Applications are synced and Healthy
All required tekton resources are installed and ready
Tekton CRDs are ready
Setup Pac with existing QE sprayproxy and github App
namespace/openshift-pipelines configured
namespace/build-service configured
namespace/integration-service configured
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
Configured pipelines-as-code-secret secret in openshift-pipelines namespace
Switched to branch 'main'
Your branch is up to date with 'upstream/main'.
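The PaC controller route is then registered with the shared QE sprayproxy, as the next log lines show. A minimal Go sketch of that registration call; the /backends endpoint, the JSON payload, and the QE_SPRAYPROXY_HOST variable are assumptions, and authentication is elided:

    package main

    import (
        "bytes"
        "fmt"
        "net/http"
        "os"
    )

    func main() {
        // PaC controller URL taken from the log; sprayproxy host and
        // endpoint are assumed, not confirmed from the repo.
        pacURL := "https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-e4b60de65d.54x8.p3.openshiftapps.com"
        body := []byte(fmt.Sprintf(`{"url": %q}`, pacURL))
        resp, err := http.Post(os.Getenv("QE_SPRAYPROXY_HOST")+"/backends",
            "application/json", bytes.NewReader(body))
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println("Registered PaC server:", pacURL, "status:", resp.Status)
    }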
I1219 17:41:44.131406 23276 common.go:434] Registered PaC server: https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-e4b60de65d.54x8.p3.openshiftapps.com
I1219 17:41:44.197686 23276 common.go:459] The PaC servers registered in Sprayproxy: https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-e4b60de65d.54x8.p3.openshiftapps.com, https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-67c3bd1c04.l1ey.p3.openshiftapps.com, https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-0c4f235e1e.4jx0.p3.openshiftapps.com, https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-bb9474c68e.rzmi.p3.openshiftapps.com, https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-9af15df48c.l72d.p3.openshiftapps.com
I1219 17:41:44.197714 23276 common.go:475] going to create new Tekton bundle remote-build for the purpose of testing multi-platform-controller PR
I1219 17:41:45.077645 23276 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.7@sha256:8b16e4e79853e3a3192f82e9f8930b79b04942bb389eaab4c44fb4d233ccefe6
I1219 17:41:45.080286 23276 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166104-igdl -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
   - Added Pipeline: buildah-remote-pipeline to image
I1219 17:41:46.760922 23276 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166104-igdl: quay.io/redhat-appstudio-qe/test-images@sha256:1c58231c70903796e3e13c83d1efed89aa4e4086ddb89b9548c9d15093ad2a70
I1219 17:41:46.760944 23276 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_ARM64 to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166104-igdl
I1219 17:41:46.998314 23276 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.7@sha256:8b16e4e79853e3a3192f82e9f8930b79b04942bb389eaab4c44fb4d233ccefe6
I1219 17:41:47.000953 23276 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166106-kzpe -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
   - Added Pipeline: buildah-remote-pipeline to image
I1219 17:41:48.677616 23276 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166106-kzpe: quay.io/redhat-appstudio-qe/test-images@sha256:3ecf7e2fbf02e07153f90c92052a83c5308506f9c219967768986eefca7b70b9
I1219 17:41:48.677644 23276 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_S390X to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166106-kzpe
I1219 17:41:48.933398 23276 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.7@sha256:8b16e4e79853e3a3192f82e9f8930b79b04942bb389eaab4c44fb4d233ccefe6
I1219 17:41:48.935407 23276 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166108-ojsq -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
   - Added Pipeline: buildah-remote-pipeline to image
I1219 17:41:50.608885 23276 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166108-ojsq: quay.io/redhat-appstudio-qe/test-images@sha256:bc871229447bfae7f2c888cb4cf63a2a750e098a28196a3a8931867e1e3182d8
I1219 17:41:50.608908 23276 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_PPC64LE to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166108-ojsq
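Each platform-specific buildah-remote bundle is exported through a CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_<ARCH> variable, which the suite can later read back per platform. A minimal sketch of such a lookup (the helper and platform list are illustrative, not the repo's code):

    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    // bundleFor returns the buildah-remote bundle override for a platform,
    // e.g. "linux/arm64" -> $CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_ARM64.
    func bundleFor(platform string) (string, bool) {
        arch := strings.ToUpper(platform[strings.LastIndex(platform, "/")+1:])
        v := os.Getenv("CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_" + arch)
        return v, v != ""
    }

    func main() {
        for _, p := range []string{"linux/arm64", "linux/s390x", "linux/ppc64le"} {
            if b, ok := bundleFor(p); ok {
                fmt.Printf("%s -> %s\n", p, b)
            }
        }
    }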
ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_PPC64LE to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1766166108-ojsq
exec: ginkgo "--seed=1766165209" "--timeout=1h30m0s" "--grace-period=30s" "--output-interceptor-mode=none" "--no-color" "--json-report=e2e-report.json" "--junit-report=e2e-report.xml" "--procs=20" "--nodes=20" "--p" "--output-dir=/workspace/artifact-dir" "./cmd" "--"
go: downloading github.com/konflux-ci/build-service v0.0.0-20240611083846-2dee6cfe6fe4
go: downloading github.com/IBM/vpc-go-sdk v0.48.0
go: downloading github.com/aws/aws-sdk-go-v2 v1.32.7
go: downloading github.com/aws/aws-sdk-go-v2/config v1.28.7
go: downloading github.com/IBM/go-sdk-core/v5 v5.15.3
go: downloading github.com/aws/aws-sdk-go-v2/service/ec2 v1.135.0
go: downloading github.com/go-playground/validator/v10 v10.17.0
go: downloading github.com/go-openapi/strfmt v0.22.0
go: downloading github.com/google/go-github/v45 v45.2.0
go: downloading go.mongodb.org/mongo-driver v1.13.1
go: downloading github.com/mitchellh/mapstructure v1.5.0
go: downloading github.com/go-openapi/errors v0.21.0
go: downloading github.com/oklog/ulid v1.3.1
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/aws/smithy-go v1.22.1
go: downloading github.com/leodido/go-urn v1.3.0
go: downloading github.com/gabriel-vasile/mimetype v1.4.3
go: downloading github.com/go-playground/universal-translator v0.18.1
go: downloading github.com/aws/aws-sdk-go-v2/credentials v1.17.48
go: downloading github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.16.22
go: downloading github.com/aws/aws-sdk-go-v2/internal/ini v1.8.1
go: downloading github.com/aws/aws-sdk-go-v2/service/sso v1.24.8
go: downloading github.com/aws/aws-sdk-go-v2/service/ssooidc v1.28.7
go: downloading github.com/aws/aws-sdk-go-v2/service/sts v1.33.3
go: downloading github.com/go-playground/locales v0.14.1
go: downloading github.com/aws/aws-sdk-go-v2/internal/configsources v1.3.26
go: downloading github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.26
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.12.1
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.12.7
Running Suite: Red Hat App Studio E2E tests - /tmp/tmp.lbMmitGtEq/cmd
=====================================================================
Random Seed: 1766165209
Will run 353 of 387 specs
Running in parallel across 20 processes
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params when context points to a file [build-templates] /tmp/tmp.lbMmitGtEq/tests/build/tkn-bundle.go:177
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles from specific context [build-templates] /tmp/tmp.lbMmitGtEq/tests/build/tkn-bundle.go:188
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params when context is the root directory [build-templates] /tmp/tmp.lbMmitGtEq/tests/build/tkn-bundle.go:198
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when context points to a file and a directory [build-templates] /tmp/tmp.lbMmitGtEq/tests/build/tkn-bundle.go:207
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when using negation [build-templates] /tmp/tmp.lbMmitGtEq/tests/build/tkn-bundle.go:217
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params allows overriding HOME environment variable [build-templates] /tmp/tmp.lbMmitGtEq/tests/build/tkn-bundle.go:227
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params allows overriding STEP image [build-templates] /tmp/tmp.lbMmitGtEq/tests/build/tkn-bundle.go:236
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-host-pool] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:120
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-host-pool] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:124
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-host-pool] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:127
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-host-pool] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:148
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created test that cleanup happened successfully [multi-platform, aws-host-pool] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:152
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:251
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:255
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:259
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:263
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, aws-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:267
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmz-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:341
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmz-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:345
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmz-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:349
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmz-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:353
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmz-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:357
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmp-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:432
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmp-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:436
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmp-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:440
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmp-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:444
------------------------------
P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmp-dynamic] /tmp/tmp.lbMmitGtEq/tests/build/multi-platform.go:448
------------------------------
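Note on the P [PENDING] entries above: these are specs that exist in the suite but are marked pending, so Ginkgo lists them without ever running them; they account for part of the gap between the 387 known specs and the 353 selected to run. A minimal sketch of how a spec ends up in that state, using Ginkgo v2's pending decorator (the suite and spec names here are illustrative, not taken from the e2e-tests repo):

package build_test

import (
	"testing"

	. "github.com/onsi/ginkgo/v2"
	"github.com/onsi/gomega"
)

// Standard Ginkgo bootstrap: hooks gomega failures into Ginkgo and runs the suite.
func TestBuild(t *testing.T) {
	gomega.RegisterFailHandler(Fail)
	RunSpecs(t, "Build Suite")
}

var _ = Describe("tkn bundle task", func() {
	// PIt registers a pending spec: it shows up in reports as "P [PENDING]"
	// but its body is never executed and it cannot fail the run.
	PIt("creates Tekton bundles from specific context", func() {})

	// An ordinary It spec runs normally and is counted as passed or failed.
	It("creates Tekton bundles", func() {})
})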
• [FAILED] [0.394 seconds]
[release-pipelines-suite e2e tests for multi arch with rh-advisories pipeline] Multi arch test happy path [BeforeAll] Post-release verification verifies the release CR is created [release-pipelines, rh-advisories, multiarch-advisories, multiArchAdvisories]
[BeforeAll] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/multiarch_advisories.go:61
[It] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/multiarch_advisories.go:113
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322 @ 12/19/25 17:43:24.334
[PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 12/19/25 17:43:24.421
<< Timeline
[FAILED] Unexpected error:
<*url.Error | 0xc000e0dc80>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host
{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc000ea1a90>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc000b15040>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }
occurred
In [BeforeAll] at: /tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322 @ 12/19/25 17:43:24.334
There were additional failures detected. To view them in detail run ginkgo -vv
------------------------------
S
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if release CR is created [release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/release_to_github.go:139
------------------------------
SS
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies the release pipelinerun is running and succeeds [release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/release_to_github.go:149
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies release CR completed and set succeeded. [release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/release_to_github.go:182
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if the Release exists in github repo [release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/release_to_github.go:193
------------------------------
• [FAILED] [0.336 seconds]
[release-pipelines-suite FBC e2e-tests] with FBC happy path [BeforeAll] Post-release verification creates component from git source https://github.com/redhat-appstudio-qe/fbc-sample-repo-test [release-pipelines, fbc-release, fbcHappyPath]
[BeforeAll] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/fbc_release.go:89
[It] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/fbc_release.go:123
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322 @ 12/19/25 17:43:24.763
[PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 12/19/25 17:43:24.764
<< Timeline
[FAILED] Unexpected error:
<*url.Error | 0xc000b2e7b0>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host
{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc000f80c30>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc0005dd720>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }
occurred
In [BeforeAll] at: /tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322 @ 12/19/25 17:43:24.763
There were additional failures detected. To view them in detail run ginkgo -vv
------------------------------
SSSSSSSSSSSSSSS
------------------------------
• [FAILED] [0.542 seconds]
[release-pipelines-suite e2e tests for rh-push-to-redhat-io pipeline] Rh-push-to-redhat-io happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rh-push-to-registry-redhat-io, PushToRedhatIO]
[BeforeAll] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rh_push_to_registry_redhat_io.go:61
[It] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rh_push_to_registry_redhat_io.go:110
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322 @ 12/19/25 17:43:24.759
[PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 12/19/25 17:43:24.973
<< Timeline
[FAILED] Unexpected error:
<*url.Error | 0xc000660690>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host
{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc000c49720>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc000c49400>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }
occurred
In [BeforeAll] at: /tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322 @ 12/19/25 17:43:24.759
There were additional failures detected. To view them in detail run ginkgo -vv
------------------------------
SSS
------------------------------
• [FAILED] [0.222 seconds]
[release-pipelines-suite e2e tests for rh-advisories pipeline] Rh-advisories happy path [BeforeAll] Post-release verification verifies if release CR is created [release-pipelines, rh-advisories, rhAdvisories]
[BeforeAll] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rh_advisories.go:61
[It] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rh_advisories.go:118
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322 @ 12/19/25 17:43:25.196
[PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 12/19/25 17:43:25.196
<< Timeline
[FAILED] Unexpected error:
<*url.Error | 0xc000661380>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host
{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc000db41e0>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc000d175e0>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }
occurred
In [BeforeAll] at: /tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322 @ 12/19/25 17:43:25.196
There were additional failures detected. To view them in detail run ginkgo -vv
------------------------------
SSS
------------------------------
• [FAILED] [3.409 seconds]
[release-pipelines-suite e2e tests for rhtap-service-push pipeline] Rhtap-service-push happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rhtap-service-push, RhtapServicePush]
[BeforeAll] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rhtap_service_push.go:75
[It] /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rhtap_service_push.go:150
Timeline >>
PR #3671 got created with sha 281f3efc8e3c2c49c2823d60c5e2c7cc36deb596
merged result sha: 05ea358cf9730abd2efceedfe3af4d359bbe83d7 for PR #3671
[FAILED] in [BeforeAll] - /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rhtap_service_push.go:119 @ 12/19/25 17:43:27.941
[PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 12/19/25 17:43:27.942
<< Timeline
[FAILED] Unexpected error:
<*fmt.wrapError | 0xc000b045c0>: failed to get API group resources: unable to retrieve the complete list of server APIs: appstudio.redhat.com/v1alpha1: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/apis/appstudio.redhat.com/v1alpha1": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host
{ msg: "failed to get API group resources: unable to retrieve the complete list of server APIs: appstudio.redhat.com/v1alpha1: Get \"https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/apis/appstudio.redhat.com/v1alpha1\": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host", err: <*apiutil.ErrResourceDiscoveryFailed | 0xc000537750>{ { Group: "appstudio.redhat.com", Version: "v1alpha1", }: <*url.Error | 0xc0008e44b0>{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/apis/appstudio.redhat.com/v1alpha1", Err: <*net.OpError | 0xc0006a4410>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc0006a4280>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }, }, }
occurred
In [BeforeAll] at: /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rhtap_service_push.go:119 @ 12/19/25 17:43:27.941
There were additional failures detected. To view them in detail run ginkgo -vv
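Note: all five release-pipelines failures above share one root cause: api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com does not resolve from the cluster DNS (172.30.0.10:53), so every BeforeAll that touches that host fails with "no such host" and the AfterAll cleanup then panics against the same unreachable endpoint. A resolution check run once up front could turn this into a single clear skip instead of repeated setup failures; a minimal sketch, assuming a hypothetical helper wired into a BeforeSuite (CheckHostResolves is not an existing function in the repo):

package preflight

import (
	"context"
	"fmt"
	"net"
	"time"
)

// CheckHostResolves resolves host once with a bounded timeout, so a suite
// can report "environment unreachable" up front instead of failing every
// BeforeAll with "dial tcp: ... no such host".
func CheckHostResolves(host string, timeout time.Duration) error {
	ctx, cancel := context.WithTimeout(context.Background(), timeout)
	defer cancel()
	if _, err := net.DefaultResolver.LookupHost(ctx, host); err != nil {
		return fmt.Errorf("preflight DNS check for %s failed: %w", host, err)
	}
	return nil
}

Called from a BeforeSuite and fed into Ginkgo's Skip, an error here would collapse this cluster of failures into one actionable message.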
------------------------------
SSS•
------------------------------
• [FAILED] [24.185 seconds]
[integration-service-suite Gitlab Status Reporting of Integration tests] Gitlab with status reporting of Integration tests in the assosiated merge request [BeforeAll] when a new Component with specified custom branch is created triggers a Build PipelineRun [integration-service, gitlab-status-reporting, custom-branch]
[BeforeAll] /tmp/tmp.lbMmitGtEq/tests/integration-service/gitlab-integration-reporting.go:46
[It] /tmp/tmp.lbMmitGtEq/tests/integration-service/gitlab-integration-reporting.go:121
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.lbMmitGtEq/tests/integration-service/gitlab-integration-reporting.go:62 @ 12/19/25 17:43:47.769
[FAILED] in [AfterAll] - /tmp/tmp.lbMmitGtEq/tests/integration-service/gitlab-integration-reporting.go:88 @ 12/19/25 17:43:48.01
<< Timeline
[FAILED] Unexpected error:
<*errors.StatusError | 0xc000f68960>: admission webhook "dintegrationtestscenario.kb.io" denied the request: could not find application 'integ-app-uocr' in namespace 'gitlab-rep-wviy'
{ ErrStatus: { TypeMeta: {Kind: "", APIVersion: ""}, ListMeta: { SelfLink: "", ResourceVersion: "", Continue: "", RemainingItemCount: nil, }, Status: "Failure", Message: "admission webhook \"dintegrationtestscenario.kb.io\" denied the request: could not find application 'integ-app-uocr' in namespace 'gitlab-rep-wviy'", Reason: "Forbidden", Details: nil, Code: 403, }, }
occurred
In [BeforeAll] at: /tmp/tmp.lbMmitGtEq/tests/integration-service/gitlab-integration-reporting.go:62 @ 12/19/25 17:43:47.769
There were additional failures detected. To view them in detail run ginkgo -vv
------------------------------
SSSSSSSSSSSSSSSSSSSS••
------------------------------
• [FAILED] [31.890 seconds]
[integration-service-suite Integration Service E2E tests] with happy path for general flow of Integration service [BeforeAll] when a new Component is created triggers a build PipelineRun [integration-service]
[BeforeAll] /tmp/tmp.lbMmitGtEq/tests/integration-service/integration.go:52
[It] /tmp/tmp.lbMmitGtEq/tests/integration-service/integration.go:85
[FAILED] Unexpected error:
<*errors.errorString | 0xc0006de0b0>: error when getting the base branch name 'onboarding' for the repo 'konflux-test-integration': GET https://api.github.com/repos/redhat-appstudio-qe/konflux-test-integration/git/ref/heads/onboarding: 404 Not Found []
{ s: "error when getting the base branch name 'onboarding' for the repo 'konflux-test-integration': GET https://api.github.com/repos/redhat-appstudio-qe/konflux-test-integration/git/ref/heads/onboarding: 404 Not Found []", }
occurred
In [BeforeAll] at: /tmp/tmp.lbMmitGtEq/tests/integration-service/integration.go:437 @ 12/19/25 17:43:47.374
------------------------------
SSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS•
------------------------------
• [FAILED] [32.640 seconds]
[integration-service-suite Status Reporting of Integration tests] with status reporting of Integration tests in CheckRuns [BeforeAll] when a new Component with specified custom branch is created does not contain an annotation with a Snapshot Name [integration-service, github-status-reporting, custom-branch]
[BeforeAll] /tmp/tmp.lbMmitGtEq/tests/integration-service/status-reporting-to-pullrequest.go:45
[It] /tmp/tmp.lbMmitGtEq/tests/integration-service/status-reporting-to-pullrequest.go:109
[FAILED] Unexpected error:
<*errors.errorString | 0xc0006965b0>: error when getting the base branch name 'onboarding' for the repo 'konflux-test-integration-status-report': GET https://api.github.com/repos/redhat-appstudio-qe/konflux-test-integration-status-report/git/ref/heads/onboarding: 404 Not Found []
{ s: "error when getting the base branch name 'onboarding' for the repo 'konflux-test-integration-status-report': GET https://api.github.com/repos/redhat-appstudio-qe/konflux-test-integration-status-report/git/ref/heads/onboarding: 404 Not Found []", }
occurred
In [BeforeAll] at: /tmp/tmp.lbMmitGtEq/tests/integration-service/integration.go:437 @ 12/19/25 17:43:49.401
------------------------------
SSSSSSSSSSSSSSSSSSSSSS•
------------------------------
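Note: the two integration-service setup failures above hit the same missing ref: GitHub returns 404 for heads/onboarding in both konflux-test-integration repos, so the suites never get past BeforeAll. Since the run already vendors github.com/google/go-github/v45, a setup-time existence check could surface this as one explicit message; a minimal sketch, with EnsureBaseBranch as a hypothetical helper name, not the repo's actual code:

package preflight

import (
	"context"
	"fmt"

	"github.com/google/go-github/v45/github"
)

// EnsureBaseBranch verifies, before the suite runs, that the base branch
// the tests fork from still exists, so a deleted "onboarding" branch
// surfaces as one clear setup error rather than a 404 inside every BeforeAll.
func EnsureBaseBranch(ctx context.Context, gh *github.Client, owner, repo, branch string) error {
	_, resp, err := gh.Git.GetRef(ctx, owner, repo, "heads/"+branch)
	if err != nil {
		if resp != nil && resp.StatusCode == 404 {
			return fmt.Errorf("base branch %q is missing in %s/%s; restore it before running the suite", branch, owner, repo)
		}
		return err
	}
	return nil
}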
• [PANICKED] [44.446 seconds]
[upgrade-suite Create users and check their state] [It] Verify AppStudioProvisionedUser [upgrade-verify]
/tmp/tmp.lbMmitGtEq/tests/upgrade/verifyWorkload.go:20
Timeline >>
"msg"="Observed a panic: \"invalid memory address or nil pointer dereference\" (runtime error: invalid memory address or nil pointer dereference)\ngoroutine 161 [running]:\nk8s.io/apimachinery/pkg/util/runtime.logPanic({0x2c4a9a0, 0x540d400})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:75 +0x85\nk8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc001c24000?})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:49 +0x65\npanic({0x2c4a9a0?, 0x540d400?})\n\t/usr/lib/golang/src/runtime/panic.go:792 +0x132\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp.func1()\n\t/tmp/tmp.lbMmitGtEq/pkg/sandbox/sandbox.go:319 +0x35\ngithub.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval.func1({0xee6b2800?, 0x0?})\n\t/tmp/tmp.lbMmitGtEq/pkg/utils/util.go:129 +0x13\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1(0xc0003c1280?, {0x38506d8?, 0xc000ac57a0?})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:53 +0x52\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x38506d8, 0xc000ac57a0}, {0x38451d0, 0xc0003c1280}, 0x1, 0x0, 0xc001716e68)\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:54 +0x115\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x3850588?, 0x54c3d60?}, 0xee6b2800, 0x419be5?, 0x1, 0xc001716e68)\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/poll.go:48 +0xa5\ngithub.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval(0xa?, 0xc000f94eb0?, 0x1?)\n\t/tmp/tmp.lbMmitGtEq/pkg/utils/util.go:129 +0x45\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp(0x323c06f?, {0x323c06f?, 0x3238ca7?}, 0x8?)\n\t/tmp/tmp.lbMmitGtEq/pkg/sandbox/sandbox.go:318 +0x72\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreated(0x0, {0x323c06f, 0x9})\n\t/tmp/tmp.lbMmitGtEq/pkg/sandbox/sandbox.go:314 +0x4b\ngithub.com/konflux-ci/e2e-tests/tests/upgrade/verify.VerifyAppStudioProvisionedUser(0x0?)\n\t/tmp/tmp.lbMmitGtEq/tests/upgrade/verify/verifyUsers.go:14 +0x25\ngithub.com/konflux-ci/e2e-tests/tests/upgrade.init.func1.2()\n\t/tmp/tmp.lbMmitGtEq/tests/upgrade/verifyWorkload.go:21 +0x1a\ngithub.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x7c2976?, 0xc001c1a900?})\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/node.go:475 +0x13\ngithub.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func3()\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/suite.go:894 +0x7b\ncreated by github.com/onsi/ginkgo/v2/internal.(*Suite).runNode in goroutine 62\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/suite.go:881 +0xd7b" "error"=null
[PANICKED] in [It] - /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 @ 12/19/25 17:44:08.677
<< Timeline
[PANICKED] Test Panicked
In [It] at: /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 @ 12/19/25 17:44:08.677
runtime error: invalid memory address or nil pointer dereference
Full Stack Trace
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc001c24000?})
	/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 +0xc7
panic({0x2c4a9a0?, 0x540d400?})
	/usr/lib/golang/src/runtime/panic.go:792 +0x132
github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp.func1()
	/tmp/tmp.lbMmitGtEq/pkg/sandbox/sandbox.go:319 +0x35
github.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval.func1({0xee6b2800?, 0x0?})
	/tmp/tmp.lbMmitGtEq/pkg/utils/util.go:129 +0x13
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1(0xc0003c1280?, {0x38506d8?, 0xc000ac57a0?})
	/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:53 +0x52
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x38506d8, 0xc000ac57a0}, {0x38451d0, 0xc0003c1280}, 0x1, 0x0, 0xc001721e68)
	/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:54 +0x115
k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x3850588?, 0x54c3d60?}, 0xee6b2800, 0x419be5?, 0x1, 0xc001716e68)
	/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/poll.go:48 +0xa5
github.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval(0xa?, 0xc000f94eb0?, 0x1?)
	/tmp/tmp.lbMmitGtEq/pkg/utils/util.go:129 +0x45
github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp(0x323c06f?, {0x323c06f?, 0x3238ca7?}, 0x8?)
	/tmp/tmp.lbMmitGtEq/pkg/sandbox/sandbox.go:318 +0x72
github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreated(0x0, {0x323c06f, 0x9})
	/tmp/tmp.lbMmitGtEq/pkg/sandbox/sandbox.go:314 +0x4b
github.com/konflux-ci/e2e-tests/tests/upgrade/verify.VerifyAppStudioProvisionedUser(0x0?)
	/tmp/tmp.lbMmitGtEq/tests/upgrade/verify/verifyUsers.go:14 +0x25
github.com/konflux-ci/e2e-tests/tests/upgrade.init.func1.2()
	/tmp/tmp.lbMmitGtEq/tests/upgrade/verifyWorkload.go:21 +0x1a
------------------------------
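Note: the stack trace is explicit about the cause of this panic: CheckUserCreated is invoked with a nil *SandboxController receiver (the 0x0 in the call frame), so the first field access inside the polling callback at sandbox.go:319 dereferences nil. A nil-receiver guard would convert the panic into an ordinary test failure; a minimal sketch of the pattern (the struct body and message here are illustrative, not the repo's actual code):

package sandbox

import "errors"

// SandboxController stands in for the real controller; fields elided.
type SandboxController struct {
	// ... cluster clients, elided
}

// CheckUserCreated guards against a nil receiver before any field access,
// so an uninitialized controller produces a clear error instead of a
// "invalid memory address or nil pointer dereference" panic mid-poll.
func (s *SandboxController) CheckUserCreated(userName string) error {
	if s == nil {
		return errors.New("sandbox controller is not initialized; check the upgrade suite setup")
	}
	// ... existing wait/poll logic would follow here
	return nil
}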
SS••••••••••••••••••••••••••••••••••••••••••••••••••••••••••
------------------------------
• [FAILED] [0.646 seconds]
[release-pipelines-suite [HACBS-1571]test-release-e2e-push-image-to-pyxis] Post-release verification [It] validate the result of task create-pyxis-image contains image ids [release-pipelines, rh-push-to-external-registry]
/tmp/tmp.lbMmitGtEq/tests/release/pipelines/rh_push_to_external_registry.go:233
[FAILED] Unexpected error:
<*errors.errorString | 0xc000910df0>: task with create-pyxis-image name doesn't exist in managed-czwl7 pipelinerun
{ s: "task with create-pyxis-image name doesn't exist in managed-czwl7 pipelinerun", }
occurred
In [It] at: /tmp/tmp.lbMmitGtEq/tests/release/pipelines/rh_push_to_external_registry.go:236 @ 12/19/25 17:46:39.566
------------------------------
SS•••
------------------------------
P [PENDING] [build-service-suite Build service E2E tests] test build secret lookup when two secrets are created when second component is deleted, pac pr branch should not exist in the repo [build-service, pac-build, secret-lookup] /tmp/tmp.lbMmitGtEq/tests/build/build.go:1121
------------------------------
•••••••••S•S•
------------------------------
P [PENDING] [build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline] /tmp/tmp.lbMmitGtEq/tests/build/build_templates.go:489
------------------------------
••••••
------------------------------
• [FAILED] [1.470 seconds]
[build-service-suite Build service E2E tests] test PaC component build github when a new Component with specified custom branch is created [It] eventually leads to the PipelineRun status report at Checks tab [build-service, github-webhook, pac-build, pipeline, image-controller, build-custom-branch]
/tmp/tmp.lbMmitGtEq/tests/build/build.go:449
[FAILED] Expected
: failure
to equal
: success
In [It] at: /tmp/tmp.lbMmitGtEq/tests/build/build.go:453 @ 12/19/25 17:52:02.719
------------------------------
SSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS•S•••••••••••S•S•
------------------------------
P [PENDING] [build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build-oci-ta should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline] /tmp/tmp.lbMmitGtEq/tests/build/build_templates.go:489
------------------------------
•••S•••••••••••••••••••
------------------------------
• [FAILED] [315.738 seconds]
[build-service-suite Build service E2E tests] test of component update with renovate github when components are created in same namespace [It] PR merge triggers PAC PipelineRun for parent component [build-service, renovate, multi-component]
/tmp/tmp.lbMmitGtEq/tests/build/build.go:1580
Timeline >>
Push PipelineRun has not been created yet for the component build-e2e-ktzy/gh-multi-component-parent-vyxk
[FAILED] in [It] - /tmp/tmp.lbMmitGtEq/tests/build/build.go:1593 @ 12/19/25 17:57:43.73
<< Timeline
[FAILED] Timed out after 300.000s.
timed out when waiting for the PipelineRun to start for the component build-e2e-ktzy/gh-multi-component-parent-vyxk
Expected success, but got an error:
<*errors.errorString | 0xc000e104e0>: no pipelinerun found for component gh-multi-component-parent-vyxk
{ s: "no pipelinerun found for component gh-multi-component-parent-vyxk", }
In [It] at: /tmp/tmp.lbMmitGtEq/tests/build/build.go:1593 @ 12/19/25 17:57:43.73
------------------------------
SSSSSSSSSSSSSSSSSS
------------------------------
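Note: the renovate failure above, and the three timeout failures that follow, all show the same shape: a condition is polled on a fixed interval, each miss logs a "... has not been created yet ..." line, and the configured timeout (300s, 600s, or 900s here) converts the last error into the [FAILED] report. A minimal sketch of that polling shape using gomega's Eventually (waitForPipelineRun and getPipelineRun are hypothetical stand-ins, not the repo's helpers):

package build

import (
	"fmt"
	"time"

	. "github.com/onsi/gomega"
)

// waitForPipelineRun retries the lookup on an interval, logging each miss;
// after the timeout, the last error becomes the
// "Timed out after ..." / "Expected success, but got an error" failure.
func waitForPipelineRun(getPipelineRun func() error, component string) {
	Eventually(func() error {
		if err := getPipelineRun(); err != nil {
			fmt.Printf("Push PipelineRun has not been created yet for the component %s\n", component)
			return err
		}
		return nil
	}, 5*time.Minute, 10*time.Second).Should(Succeed(),
		"timed out when waiting for the PipelineRun to start for the component %s", component)
}

The repeated log lines in these Timelines are a side effect of that per-attempt logging; only the first occurrence is kept above, since the repeats carry no additional information.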
• [FAILED] [300.872 seconds]
[build-service-suite Build service E2E tests] test pac with multiple components using same repository when components are created in same namespace [It] leads to triggering on push PipelineRun [build-service, pac-build, multi-component]
/tmp/tmp.lbMmitGtEq/tests/build/build.go:834
Timeline >>
Push PipelineRun has not been created yet for the component build-e2e-gxdy/go-component-wxuujb
[FAILED] in [It] - /tmp/tmp.lbMmitGtEq/tests/build/build.go:847 @ 12/19/25 17:58:29.348
<< Timeline
[FAILED] Timed out after 300.001s.
timed out when waiting for the PipelineRun to start for the component build-e2e-gxdy/go-component-wxuujb
Expected success, but got an error:
<*errors.errorString | 0xc0017100f0>: no pipelinerun found for component go-component-wxuujb
{ s: "no pipelinerun found for component go-component-wxuujb", }
In [It] at: /tmp/tmp.lbMmitGtEq/tests/build/build.go:847 @ 12/19/25 17:58:29.348
------------------------------
SSSSSSSSS
------------------------------
• [FAILED] [600.067 seconds]
[konflux-demo-suite] Maven project - Default build when push pipelinerun is retriggered [It] should eventually succeed [konflux]
/tmp/tmp.lbMmitGtEq/tests/konflux-demo/konflux-demo.go:329
Timeline >>
PipelineRun is not been retriggered yet for the component konflux-hvmt/konflux-demo-component-dast
[FAILED] in [It] - /tmp/tmp.lbMmitGtEq/tests/konflux-demo/konflux-demo.go:342 @ 12/19/25 18:03:20.279
<< Timeline
[FAILED] Timed out after 600.001s.
timed out when waiting for the PipelineRun to retrigger for the component konflux-hvmt/konflux-demo-component-dast
Expected success, but got an error:
<*errors.errorString | 0xc0015d3080>: no pipelinerun found for component konflux-demo-component-dast
{ s: "no pipelinerun found for component konflux-demo-component-dast", }
In [It] at: /tmp/tmp.lbMmitGtEq/tests/konflux-demo/konflux-demo.go:342 @ 12/19/25 18:03:20.279
------------------------------
SSSSSS
------------------------------
• [FAILED] [901.383 seconds]
[integration-service-suite Creation of group snapshots for monorepo and multiple repos] with status reporting of Integration tests in CheckRuns when we start creation of a new Component B [It] triggers a Build PipelineRun for component python-component [integration-service, group-snapshot-creation]
/tmp/tmp.lbMmitGtEq/tests/integration-service/group-snapshots-tests.go:235
Timeline >>
Build PipelineRun has not been created yet for the componentB group-gtzp/python-component-fsrfcg
[FAILED] in [It] - /tmp/tmp.lbMmitGtEq/tests/integration-service/group-snapshots-tests.go:246 @ 12/19/25 18:08:43.915
<< Timeline
[FAILED] Timed out after 900.001s.
timed out when waiting for the build PipelineRun to start for the componentB group-gtzp/python-component-fsrfcg
Expected success, but got an error:
<*errors.errorString | 0xc000ede500>: no pipelinerun found for component python-component-fsrfcg
{ s: "no pipelinerun found for component python-component-fsrfcg", }
In [It] at: /tmp/tmp.lbMmitGtEq/tests/integration-service/group-snapshots-tests.go:246 @ 12/19/25 18:08:43.915
------------------------------
SSSSSSSSSSSSSSSSSSSSSSSSSSSSS

Summarizing 15 Failures:
[FAIL] [integration-service-suite Gitlab Status Reporting of Integration tests] Gitlab with status reporting of Integration tests in the assosiated merge request [BeforeAll] when a new Component with specified custom branch is created triggers a Build PipelineRun [integration-service, gitlab-status-reporting, custom-branch]
/tmp/tmp.lbMmitGtEq/tests/integration-service/gitlab-integration-reporting.go:62
[FAIL] [integration-service-suite Integration Service E2E tests] with happy path for general flow of Integration service [BeforeAll] when a new Component is created triggers a build PipelineRun [integration-service]
/tmp/tmp.lbMmitGtEq/tests/integration-service/integration.go:437
[FAIL] [integration-service-suite Status Reporting of Integration tests] with status reporting of Integration tests in CheckRuns [BeforeAll] when a new Component with specified custom branch is created does not contain an annotation with a Snapshot Name [integration-service, github-status-reporting, custom-branch]
/tmp/tmp.lbMmitGtEq/tests/integration-service/integration.go:437
[PANICKED!] [upgrade-suite Create users and check their state] [It] Verify AppStudioProvisionedUser [upgrade-verify]
/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56
[FAIL] [release-pipelines-suite e2e tests for rhtap-service-push pipeline] Rhtap-service-push happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rhtap-service-push, RhtapServicePush]
/tmp/tmp.lbMmitGtEq/tests/release/pipelines/rhtap_service_push.go:119
[FAIL] [release-pipelines-suite FBC e2e-tests] with FBC happy path [BeforeAll] Post-release verification creates component from git source https://github.com/redhat-appstudio-qe/fbc-sample-repo-test [release-pipelines, fbc-release, fbcHappyPath]
/tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322
[FAIL] [release-pipelines-suite [HACBS-1571]test-release-e2e-push-image-to-pyxis] Post-release verification [It] validate the result of task create-pyxis-image contains image ids [release-pipelines, rh-push-to-external-registry]
/tmp/tmp.lbMmitGtEq/tests/release/pipelines/rh_push_to_external_registry.go:236
[FAIL] [release-pipelines-suite e2e tests for multi arch with rh-advisories pipeline] Multi arch test happy path [BeforeAll] Post-release verification verifies the release CR is created [release-pipelines, rh-advisories, multiarch-advisories, multiArchAdvisories]
/tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322
[FAIL] [build-service-suite Build service E2E tests] test PaC component build github when a new Component with specified custom branch is created [It] eventually leads to the PipelineRun status report at Checks tab [build-service, github-webhook, pac-build, pipeline, image-controller, build-custom-branch]
/tmp/tmp.lbMmitGtEq/tests/build/build.go:453
[FAIL] [build-service-suite Build service E2E tests] test of component update with renovate github when components are created in same namespace [It] PR merge triggers PAC PipelineRun for parent component [build-service, renovate, multi-component]
/tmp/tmp.lbMmitGtEq/tests/build/build.go:1593
[FAIL] [build-service-suite Build service E2E tests] test pac with multiple components using same repository when components are created in same namespace [It] leads to triggering on push PipelineRun [build-service, pac-build, multi-component]
/tmp/tmp.lbMmitGtEq/tests/build/build.go:847
[FAIL] [konflux-demo-suite] Maven project - Default build when push pipelinerun is retriggered [It] should eventually succeed [konflux]
/tmp/tmp.lbMmitGtEq/tests/konflux-demo/konflux-demo.go:342
[FAIL] [release-pipelines-suite e2e tests for rh-push-to-redhat-io pipeline] Rh-push-to-redhat-io happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rh-push-to-registry-redhat-io, PushToRedhatIO]
/tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322
[FAIL] [release-pipelines-suite e2e tests for rh-advisories pipeline] Rh-advisories happy path [BeforeAll] Post-release verification verifies if release CR is created [release-pipelines, rh-advisories, rhAdvisories]
/tmp/tmp.lbMmitGtEq/tests/release/releaseLib.go:322
[FAIL] [integration-service-suite Creation of group snapshots for monorepo and multiple repos] with status reporting of Integration tests in CheckRuns when we start creation of a new Component B [It] triggers a Build PipelineRun for component python-component [integration-service, group-snapshot-creation]
/tmp/tmp.lbMmitGtEq/tests/integration-service/group-snapshots-tests.go:246

Ran 134 of 387 Specs in 1521.473 seconds
FAIL! -- 119 Passed | 15 Failed | 34 Pending | 219 Skipped

Ginkgo ran 1 suite in 26m54.808774381s

Test Suite Failed
Error: running "ginkgo --seed=1766165209 --timeout=1h30m0s --grace-period=30s --output-interceptor-mode=none --no-color --json-report=e2e-report.json --junit-report=e2e-report.xml --procs=20 --nodes=20 --p --output-dir=/workspace/artifact-dir ./cmd --" failed with exit code 1
make: *** [Makefile:25: ci/test/e2e] Error 1
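Note: ginkgo exits non-zero on failure, and that exit code propagates through mage to make, which is the Error 1 above. Because the suite was started with --json-report=e2e-report.json, the same 15 failures can also be triaged mechanically from the report in /workspace/artifact-dir instead of from this console log. A minimal sketch, assuming Ginkgo v2's JSON report layout (an array of suite reports, each with SpecReports carrying a State string and the spec's name parts; field names here follow that layout and should be verified against the actual report):

package main

import (
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

// specReport mirrors the subset of Ginkgo v2's SpecReport fields needed
// to list failed specs; all other fields are ignored during decoding.
type specReport struct {
	State                   string   `json:"State"`
	ContainerHierarchyTexts []string `json:"ContainerHierarchyTexts"`
	LeafNodeText            string   `json:"LeafNodeText"`
}

type suiteReport struct {
	SpecReports []specReport `json:"SpecReports"`
}

func main() {
	data, err := os.ReadFile("e2e-report.json")
	if err != nil {
		panic(err)
	}
	var suites []suiteReport
	if err := json.Unmarshal(data, &suites); err != nil {
		panic(err)
	}
	// Print the full name of every failed or panicked spec.
	for _, s := range suites {
		for _, r := range s.SpecReports {
			if r.State == "failed" || r.State == "panicked" {
				name := strings.Join(append(r.ContainerHierarchyTexts, r.LeafNodeText), " ")
				fmt.Printf("[%s] %s\n", r.State, name)
			}
		}
	}
}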