./mage -v ci:teste2e
go: downloading github.com/google/go-cmp v0.7.0
go: downloading golang.org/x/net v0.38.0
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20260122232226-8e98ce8d340d
go: downloading google.golang.org/protobuf v1.36.11
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20260120174246-409b4a993575
go: downloading google.golang.org/grpc v1.71.0
Running target: CI:TestE2E
I0123 01:45:42.129744 21569 magefile.go:529] setting up new custom bundle for testing...
I0123 01:45:42.491087 21569 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769132742-dpyi -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: docker-build to image
I0123 01:45:44.041324 21569 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769132742-dpyi: quay.io/redhat-appstudio-qe/test-images@sha256:31e101a59c850d386db59e0277d57ed20f1964dfcaaad3dc10b7ec03a46fced0
I0123 01:45:44.041346 21569 magefile.go:535] To use the custom docker bundle locally, run below cmd:
export CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE=quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769132742-dpyi
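The exported CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE above is how the freshly built bundle can be handed to a local run. A minimal Go sketch of consuming it, assuming the tests simply read the variable with os.Getenv and fall back to a default bundle ref; the helper name and the fallback value are illustrative, not the actual e2e-tests code:

// Sketch only: resolve the docker-build pipeline bundle for a local run.
// The environment variable name comes from the log above; the helper and
// the fallback ref are assumptions made for illustration.
package config

import "os"

const defaultDockerBuildBundle = "quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-default" // hypothetical fallback

// DockerBuildPipelineBundle returns the custom bundle when the CI/mage target
// exported one, otherwise the default ref.
func DockerBuildPipelineBundle() string {
	if ref := os.Getenv("CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE"); ref != "" {
		return ref
	}
	return defaultDockerBuildBundle
}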
I0123 01:45:44.041364 21569 e2e_repo.go:347] checking if repository is e2e-tests
I0123 01:45:44.041368 21569 e2e_repo.go:335] multi-platform tests and require sprayproxy registering are set to TRUE
exec: git "diff" "--name-status" "upstream/main..HEAD"
I0123 01:45:44.044272 21569 util.go:451] The following files, go.mod, go.sum, were changed!
exec: go "install" "-mod=mod" "github.com/onsi/ginkgo/v2/ginkgo"
go: downloading github.com/go-task/slim-sprig/v3 v3.0.0
go: downloading github.com/google/pprof v0.0.0-20241210010833-40e02aabc2ad
I0123 01:45:47.260337 21569 install.go:188] cloning 'https://github.com/redhat-appstudio/infra-deployments' with git ref 'refs/heads/main'
Enumerating objects: 72646, done.
Counting objects: 100% (84/84), done.
Compressing objects: 100% (53/53), done.
Total 72646 (delta 49), reused 31 (delta 31), pack-reused 72562 (from 4)
From https://github.com/redhat-appstudio/infra-deployments
 * branch main -> FETCH_HEAD
Already up to date.
Installing the OpenShift GitOps operator subscription:
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
subscription.operators.coreos.com/openshift-gitops-operator created
Waiting for default project (and namespace) to exist: OK
Waiting for OpenShift GitOps Route: OK
argocd.argoproj.io/openshift-gitops patched
argocd.argoproj.io/openshift-gitops patched
Switch the Route to use re-encryption
argocd.argoproj.io/openshift-gitops patched
Restarting ArgoCD Server
pod "openshift-gitops-server-78868c5878-d9knl" deleted
Allow any authenticated users to be admin on the Argo CD instance
argocd.argoproj.io/openshift-gitops patched
Mark Pending PVC as Healthy, workaround for WaitForFirstConsumer StorageClasses.
Warning: unknown field "spec.resourceCustomizations"
argocd.argoproj.io/openshift-gitops patched (no change)
Setting kustomize build options
argocd.argoproj.io/openshift-gitops patched
Setting ignore Aggregated Roles
argocd.argoproj.io/openshift-gitops patched
Setting ArgoCD tracking method to annotation
argocd.argoproj.io/openshift-gitops patched
Restarting GitOps server
deployment.apps/openshift-gitops-server restarted
=========================================================================
Argo CD URL is: https://openshift-gitops-server-openshift-gitops.apps.rosa.kx-53555a9322.pkqy.p3.openshiftapps.com
(NOTE: It may take a few moments for the route to become available)
Waiting for the route: OK
Login/password uses your OpenShift credentials ('Login with OpenShift' button)
Setting secrets for Quality Dashboard
namespace/quality-dashboard created
secret/quality-dashboard-secrets created
Creating secret for CI Helper App
namespace/ci-helper-app created
secret/ci-helper-app-secrets created
Setting secrets for pipeline-service
tekton-results namespace already exists, skipping creation
tekton-logging namespace already exists, skipping creation
namespace/product-kubearchive-logging created
Creating DB secret
secret/tekton-results-database created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
secret/minio-storage-configuration created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
MinIO config already exists, skipping creation
Creating Postgres TLS certs
Certificate request self-signature ok
subject=CN=cluster.local
Certificate request self-signature ok
subject=CN=postgres-postgresql.tekton-results.svc.cluster.local
secret/postgresql-tls created
configmap/rds-root-crt created
namespace/application-service created
Creating a has secret from legacy token
secret/has-github-token created
Creating a secret with a token for Image Controller
namespace/image-controller created
secret/quaytoken created
Configuring the cluster with a pull secret for Docker Hub
Saved credentials for docker.io into /tmp/tmp.MCENc46o09
secret/pull-secret data updated
Saved credentials for docker.io into /tmp/tmp.MCENc46o09
secret/docker-io-pull created
Setting secrets for Dora metrics exporter
namespace/dora-metrics created
secret/exporters-secret created
Setting Cluster Mode: preview
Switched to a new branch 'preview-main-jpnx'
labeling node/ip-10-0-136-72.ec2.internal...
node/ip-10-0-136-72.ec2.internal labeled
successfully labeled node/ip-10-0-136-72.ec2.internal
labeling node/ip-10-0-158-228.ec2.internal...
node/ip-10-0-158-228.ec2.internal labeled
successfully labeled node/ip-10-0-158-228.ec2.internal
labeling node/ip-10-0-170-42.ec2.internal...
node/ip-10-0-170-42.ec2.internal labeled
successfully labeled node/ip-10-0-170-42.ec2.internal
verifying labels... all nodes labeled successfully.
Detected OCP minor version: 17
Changing AppStudio Gitlab Org to "redhat-appstudio-qe"
[preview-main-jpnx 5a40b4f47] Preview mode, do not merge into main
 6 files changed, 12 insertions(+), 18 deletions(-)
remote:
remote: Create a pull request for 'preview-main-jpnx' on GitHub by visiting:
remote: https://github.com/redhat-appstudio-qe/infra-deployments/pull/new/preview-main-jpnx
remote:
To https://github.com/redhat-appstudio-qe/infra-deployments.git
 * [new branch] preview-main-jpnx -> preview-main-jpnx
branch 'preview-main-jpnx' set up to track 'qe/preview-main-jpnx'.
application.argoproj.io/all-application-sets created
Waiting for sync of all-application-sets argoCD app
application.argoproj.io/kyverno-in-cluster-local patched
application.argoproj.io/knative-eventing-in-cluster-local patched
application.argoproj.io/vector-tekton-logs-collector-in-cluster-local patched
application.argoproj.io/disable-csvcopy-in-cluster-local patched
application.argoproj.io/monitoring-registry-in-cluster-local patched
application.argoproj.io/internal-services-in-cluster-local patched
application.argoproj.io/image-controller-in-cluster-local patched
application.argoproj.io/konflux-kite-in-cluster-local patched
application.argoproj.io/image-rbac-proxy-in-cluster-local patched
application.argoproj.io/trust-manager-in-cluster-local patched
application.argoproj.io/crossplane-control-plane-in-cluster-local patched
application.argoproj.io/monitoring-workload-prometheus-in-cluster-local patched
application.argoproj.io/repository-validator-in-cluster-local patched
application.argoproj.io/vector-kubearchive-log-collector-in-cluster-local patched (no change)
application.argoproj.io/tracing-workload-tracing-in-cluster-local patched
application.argoproj.io/all-application-sets patched
application.argoproj.io/release-in-cluster-local patched
application.argoproj.io/tempo-in-cluster-local patched
application.argoproj.io/cert-manager-in-cluster-local patched
application.argoproj.io/integration-in-cluster-local patched
application.argoproj.io/kubearchive-in-cluster-local patched
application.argoproj.io/project-controller-in-cluster-local patched
application.argoproj.io/kueue-in-cluster-local patched
application.argoproj.io/enterprise-contract-in-cluster-local patched
application.argoproj.io/squid-in-cluster-local patched
application.argoproj.io/application-api-in-cluster-local patched
application.argoproj.io/monitoring-workload-grafana-in-cluster-local patched
application.argoproj.io/dora-metrics-in-cluster-local patched
application.argoproj.io/perf-team-prometheus-reader-in-cluster-local patched
application.argoproj.io/konflux-rbac-in-cluster-local patched
application.argoproj.io/policies-in-cluster-local patched
application.argoproj.io/tracing-workload-otel-collector-in-cluster-local patched
application.argoproj.io/multi-platform-controller-in-cluster-local patched
application.argoproj.io/pipeline-service-in-cluster-local patched
application.argoproj.io/has-in-cluster-local patched
application.argoproj.io/build-templates-in-cluster-local patched
application.argoproj.io/mintmaker-in-cluster-local patched
application.argoproj.io/build-service-in-cluster-local patched
kueue-in-cluster-local OutOfSync Healthy
pipeline-service-in-cluster-local OutOfSync Missing
trust-manager-in-cluster-local OutOfSync Missing
Waiting 10 seconds for application sync
pipeline-service-in-cluster-local OutOfSync Missing
trust-manager-in-cluster-local OutOfSync Missing
Waiting 10 seconds for application sync
trust-manager-in-cluster-local OutOfSync Missing
Waiting 10 seconds for application sync
All Applications are synced and Healthy
All required tekton resources are installed and ready
Tekton CRDs are ready
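The loop above polls each Argo CD Application every 10 seconds until it reports Synced/Healthy. A minimal Go sketch of that kind of wait, using the dynamic client against the argoproj.io/v1alpha1 Application resource; the helper name, timeout, and error handling are assumptions for illustration, not the script's actual implementation:

// Sketch of the "Waiting 10 seconds for application sync" loop: poll an Argo CD
// Application until status.sync.status is Synced and status.health.status is Healthy.
package argocdwait

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/dynamic"
)

var applicationGVR = schema.GroupVersionResource{Group: "argoproj.io", Version: "v1alpha1", Resource: "applications"}

// WaitForAppSync blocks until the named Application is Synced and Healthy, or the timeout expires.
func WaitForAppSync(ctx context.Context, dyn dynamic.Interface, namespace, name string) error {
	return wait.PollUntilContextTimeout(ctx, 10*time.Second, 15*time.Minute, true, func(ctx context.Context) (bool, error) {
		app, err := dyn.Resource(applicationGVR).Namespace(namespace).Get(ctx, name, metav1.GetOptions{})
		if err != nil {
			return false, nil // treat API errors as transient and keep polling
		}
		sync, _, _ := unstructured.NestedString(app.Object, "status", "sync", "status")
		health, _, _ := unstructured.NestedString(app.Object, "status", "health", "status")
		fmt.Printf("%s %s %s\n", name, sync, health) // mirrors the "<app> OutOfSync Missing" lines above
		return sync == "Synced" && health == "Healthy", nil
	})
}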
Setup Pac with existing QE sprayproxy and github App
namespace/openshift-pipelines configured
namespace/build-service configured
namespace/integration-service configured
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
Configured pipelines-as-code-secret secret in openshift-pipelines namespace
Switched to branch 'main'
Your branch is up to date with 'upstream/main'.
[controller-runtime] log.SetLogger(...) was never called; logs will not be displayed.
Detected at:
> goroutine 94 [running]:
> runtime/debug.Stack()
> /usr/lib/golang/src/runtime/debug/stack.go:26 +0x5e
> sigs.k8s.io/controller-runtime/pkg/log.eventuallyFulfillRoot()
> /opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/log.go:60 +0xcd
> sigs.k8s.io/controller-runtime/pkg/log.(*delegatingLogSink).WithName(0xc0005457c0, {0x2fa26be, 0x14})
> /opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/deleg.go:147 +0x3e
> github.com/go-logr/logr.Logger.WithName({{0x3701eb0, 0xc0005457c0}, 0x0}, {0x2fa26be?, 0x0?})
> /opt/app-root/src/go/pkg/mod/github.com/go-logr/logr@v1.4.2/logr.go:345 +0x36
> sigs.k8s.io/controller-runtime/pkg/client.newClient(0x2d7dce0?, {0x0, 0xc0005eefc0, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
> /opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:129 +0xf1
> sigs.k8s.io/controller-runtime/pkg/client.New(0xc000144248?, {0x0, 0xc0005eefc0, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
> /opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:110 +0x7d
> github.com/konflux-ci/e2e-tests/pkg/clients/kubernetes.NewAdminKubernetesClient()
> /tmp/tmp.ai5Z8MqDrc/pkg/clients/kubernetes/client.go:157 +0xa5
> github.com/konflux-ci/e2e-tests/pkg/clients/sprayproxy.GetPaCHost()
> /tmp/tmp.ai5Z8MqDrc/pkg/clients/sprayproxy/sprayproxy.go:93 +0x1c
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.registerPacServer()
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/repos/common.go:426 +0x78
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.init.func8(0xc000ab7b08?)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/repos/common.go:378 +0x25
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.ActionFunc.Execute(0xc?, 0x2f7d017?)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:279 +0x19
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Apply(...)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:315
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x525d7a0, 0xc000ab7b08)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:348 +0xb3
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x5256220?, 0xc00163bc00?, 0x1f1cbd9?}, 0xc000ab7b08)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:245 +0x4f
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:308
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x525d860, 0xc000ab7b08)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:340 +0x2b
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x525f4e0?, 0x7f9d93337108?, 0x70?}, 0xc000ab7b08)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:245 +0x4f
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:308
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).runLoadedCatalog(0x5294fb0, {0xc000a06508?, 0xc000ddbe60?, 0x47?}, 0xc000ab7b08)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:129 +0x119
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).RunRulesOfCategory(0x5294fb0, {0x2f773c3, 0x2}, 0xc000ab7b08)
> /tmp/tmp.ai5Z8MqDrc/magefiles/rulesengine/types.go:121 +0x1b4
> main.CI.TestE2E({})
> /tmp/tmp.ai5Z8MqDrc/magefiles/magefile.go:330 +0x18a
> main.main.func19({0xc0005d64c0?, 0x178f88e?})
> /tmp/tmp.ai5Z8MqDrc/magefiles/mage_output_file.go:838 +0xf
> main.main.func12.1()
> /tmp/tmp.ai5Z8MqDrc/magefiles/mage_output_file.go:303 +0x5b
> created by main.main.func12 in goroutine 1
> /tmp/tmp.ai5Z8MqDrc/magefiles/mage_output_file.go:298 +0xbe
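The [controller-runtime] warning above is emitted because a controller-runtime client is created before any logger has been registered; it is harmless for this run but hides client logs. A minimal sketch of the usual remedy, calling log.SetLogger once at startup. Where exactly to wire this into the magefiles (for example at the top of CI.TestE2E) is an assumption:

// Sketch: register a logger for controller-runtime before the first client is built,
// which silences the "log.SetLogger(...) was never called" warning above.
package logsetup

import (
	ctrllog "sigs.k8s.io/controller-runtime/pkg/log"
	"sigs.k8s.io/controller-runtime/pkg/log/zap"
)

// Init wires a zap-backed logr.Logger into controller-runtime.
func Init() {
	ctrllog.SetLogger(zap.New(zap.UseDevMode(true)))
}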
I0123 01:58:41.483070 21569 common.go:434] Registered PaC server: https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-53555a9322.pkqy.p3.openshiftapps.com
I0123 01:58:41.549129 21569 common.go:459] The PaC servers registered in Sprayproxy: https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-53555a9322.pkqy.p3.openshiftapps.com, https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-337d733b34.dv02.p3.openshiftapps.com
I0123 01:58:41.549158 21569 common.go:475] going to create new Tekton bundle remote-build for the purpose of testing multi-platform-controller PR
I0123 01:58:42.014705 21569 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.7@sha256:97b646590d8863a029c4105ed78391b7b2f378277da256b42c59a477fa87fca0
I0123 01:58:42.017035 21569 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133521-ntgu -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I0123 01:58:43.057388 21569 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133521-ntgu: quay.io/redhat-appstudio-qe/test-images@sha256:f18db3919bcb5561e06c6697e8f8f9c3c23ba5c9798e5a6b1a8d7476773b9336
I0123 01:58:43.057417 21569 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_ARM64 to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133521-ntgu
I0123 01:58:43.271750 21569 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.7@sha256:97b646590d8863a029c4105ed78391b7b2f378277da256b42c59a477fa87fca0
I0123 01:58:43.274242 21569 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133523-ovhz -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I0123 01:58:44.581706 21569 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133523-ovhz: quay.io/redhat-appstudio-qe/test-images@sha256:d5cb8da54cb6449ebe7356d2c8b3533d051db50fe856947e5cfe2e5ff9c73b86
I0123 01:58:44.581738 21569 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_S390X to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133523-ovhz
I0123 01:58:44.897904 21569 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.7@sha256:97b646590d8863a029c4105ed78391b7b2f378277da256b42c59a477fa87fca0
I0123 01:58:44.899859 21569 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133524-knbp -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I0123 01:58:46.115836 21569 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133524-knbp: quay.io/redhat-appstudio-qe/test-images@sha256:88ee39c086c97af6f5bae18f4fccb377ca74a0fc303337d73593e0a9f11c5381
I0123 01:58:46.115869 21569 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_PPC64LE to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1769133524-knbp
exec: ginkgo "--seed=1769132742" "--timeout=1h30m0s" "--grace-period=30s" "--output-interceptor-mode=none" "--no-color" "--json-report=e2e-report.json" "--junit-report=e2e-report.xml" "--procs=20" "--nodes=20" "--p" "--output-dir=/workspace/artifact-dir" "./cmd" "--"
go: downloading github.com/konflux-ci/build-service v0.0.0-20240611083846-2dee6cfe6fe4
go: downloading github.com/IBM/go-sdk-core/v5 v5.15.3
go: downloading github.com/IBM/vpc-go-sdk v0.48.0
go: downloading github.com/aws/aws-sdk-go-v2 v1.32.7
go: downloading github.com/aws/aws-sdk-go-v2/config v1.28.7
go: downloading github.com/aws/aws-sdk-go-v2/service/ec2 v1.135.0
go: downloading github.com/go-playground/validator/v10 v10.17.0
go: downloading github.com/go-openapi/strfmt v0.22.0
go: downloading github.com/google/go-github/v45 v45.2.0
go: downloading go.mongodb.org/mongo-driver v1.13.1
go: downloading github.com/mitchellh/mapstructure v1.5.0
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/oklog/ulid v1.3.1
go: downloading github.com/go-openapi/errors v0.21.0
go: downloading github.com/aws/smithy-go v1.22.1
go: downloading github.com/aws/aws-sdk-go-v2/credentials v1.17.48
go: downloading github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.16.22
go: downloading github.com/aws/aws-sdk-go-v2/service/sso v1.24.8
go: downloading github.com/aws/aws-sdk-go-v2/internal/ini v1.8.1
go: downloading github.com/aws/aws-sdk-go-v2/service/ssooidc v1.28.7
go: downloading github.com/aws/aws-sdk-go-v2/service/sts v1.33.3
go: downloading github.com/gabriel-vasile/mimetype v1.4.3
go: downloading github.com/go-playground/universal-translator v0.18.1
go: downloading github.com/leodido/go-urn v1.3.0
go: downloading github.com/aws/aws-sdk-go-v2/internal/configsources v1.3.26
go: downloading github.com/go-playground/locales v0.14.1
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.12.1
go: downloading github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.26
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.12.7
Running Suite: Red Hat App Studio E2E tests - /tmp/tmp.ai5Z8MqDrc/cmd
=====================================================================
Random Seed: 1769132742
Will run 354 of 388 specs
Running in parallel across 20 processes
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if release CR is created [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/release_to_github.go:139
------------------------------
P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies the release pipelinerun is running and succeeds
[release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/release_to_github.go:149 ------------------------------ P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies release CR completed and set succeeded. [release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/release_to_github.go:182 ------------------------------ P [PENDING] [release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if the Release exists in github repo [release-pipelines, release-to-github, releaseToGithub] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/release_to_github.go:193 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-host-pool] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:120 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-host-pool] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:124 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-host-pool] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:127 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-host-pool] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:148 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created test that cleanup happened successfully [multi-platform, aws-host-pool] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:152 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:251 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:255 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:259 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the 
Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:263 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, aws-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:267 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmz-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:341 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmz-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:345 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmz-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:349 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmz-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:353 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmz-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:357 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmp-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:432 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmp-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:436 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmp-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:440 ------------------------------ P [PENDING] [multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmp-dynamic] /tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:444 ------------------------------ P [PENDING] [multi-platform-build-service-suite 
Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmp-dynamic]
/tmp/tmp.ai5Z8MqDrc/tests/build/multi-platform.go:448
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params when context points to a file [build-templates]
/tmp/tmp.ai5Z8MqDrc/tests/build/tkn-bundle.go:177
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles from specific context [build-templates]
/tmp/tmp.ai5Z8MqDrc/tests/build/tkn-bundle.go:188
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params when context is the root directory [build-templates]
/tmp/tmp.ai5Z8MqDrc/tests/build/tkn-bundle.go:198
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when context points to a file and a directory [build-templates]
/tmp/tmp.ai5Z8MqDrc/tests/build/tkn-bundle.go:207
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when using negation [build-templates]
/tmp/tmp.ai5Z8MqDrc/tests/build/tkn-bundle.go:217
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params allows overriding HOME environment variable [build-templates]
/tmp/tmp.ai5Z8MqDrc/tests/build/tkn-bundle.go:227
------------------------------
P [PENDING] [task-suite tkn bundle task] creates Tekton bundles with different params allows overriding STEP image [build-templates]
/tmp/tmp.ai5Z8MqDrc/tests/build/tkn-bundle.go:236
------------------------------
• [FAILED] [0.393 seconds]
[release-pipelines-suite e2e tests for multi arch with rh-advisories pipeline] Multi arch test happy path [BeforeAll] Post-release verification verifies the release CR is created [release-pipelines, rh-advisories, multiarch-advisories, multiArchAdvisories]
[BeforeAll] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/multiarch_advisories.go:61
[It] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/multiarch_advisories.go:113
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322 @ 01/23/26 02:00:19.873
[PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 01/23/26 02:00:19.873
<< Timeline
[FAILED] Unexpected error:
<*url.Error | 0xc000e4ecc0>:
Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host
{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc0008b6870>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc000bde460>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }
occurred
In [BeforeAll] at: /tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322 @ 01/23/26 02:00:19.873
There were additional failures detected.
To view them in detail run ginkgo -vv
------------------------------
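This failure and several of the release-pipelines failures that follow share one root cause: the in-cluster resolver (172.30.0.10:53) cannot resolve api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com, so every call to the remote release cluster fails with "no such host". A minimal Go sketch of a fail-fast preflight that a suite setup could run before creating any resources; the function and where it would be wired in are assumptions, not the suites' actual code:

// Sketch: resolve the remote release-cluster API host once up front, so a broken
// or unreachable environment produces a single clear error instead of many dial failures.
package preflight

import (
	"context"
	"fmt"
	"net"
	"net/url"
	"time"
)

// CheckHostResolvable returns an error if the host in apiURL does not resolve via DNS.
func CheckHostResolvable(apiURL string) error {
	u, err := url.Parse(apiURL)
	if err != nil {
		return fmt.Errorf("invalid API URL %q: %w", apiURL, err)
	}
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()
	if _, err := net.DefaultResolver.LookupHost(ctx, u.Hostname()); err != nil {
		return fmt.Errorf("release cluster %q is not resolvable from this cluster: %w", u.Hostname(), err)
	}
	return nil
}

In a Ginkgo BeforeAll this kind of check could gate the suite with a Skip or a single Fail, so the affected release-pipelines suites report one actionable message instead of repeated DNS errors.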
• [FAILED] [0.214 seconds]
[release-pipelines-suite FBC e2e-tests] with FBC happy path [BeforeAll] Post-release verification creates component from git source https://github.com/redhat-appstudio-qe/fbc-sample-repo-test [release-pipelines, fbc-release, fbcHappyPath]
[BeforeAll] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/fbc_release.go:89
[It] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/fbc_release.go:123
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322 @ 01/23/26 02:00:20.09
[PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 01/23/26 02:00:20.09
<< Timeline
[FAILED] Unexpected error:
<*url.Error | 0xc00086ce40>:
Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host
{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc00018cf00>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc0005fc000>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }
occurred
In [BeforeAll] at: /tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322 @ 01/23/26 02:00:20.09
There were additional failures detected.
To view them in detail run ginkgo -vv
------------------------------
• [FAILED] [0.216 seconds]
[release-pipelines-suite e2e tests for rh-advisories pipeline] Rh-advisories happy path [BeforeAll] Post-release verification verifies if release CR is created [release-pipelines, rh-advisories, rhAdvisories]
[BeforeAll] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rh_advisories.go:61
[It] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rh_advisories.go:118
Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322 @ 01/23/26 02:00:20.092
[PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 01/23/26 02:00:20.093
<< Timeline
[FAILED] Unexpected error:
<*url.Error | 0xc0010347e0>:
Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host
{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc000e50780>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc000144050>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }
occurred
In [BeforeAll] at: /tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322 @ 01/23/26 02:00:20.092
There were additional failures detected.
To view them in detail run ginkgo -vv ------------------------------ SSSSSSSSSS ------------------------------ • [FAILED] [0.417 seconds] [release-pipelines-suite e2e tests for rh-push-to-redhat-io pipeline] Rh-push-to-redhat-io happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rh-push-to-registry-redhat-io, PushToRedhatIO] [BeforeAll] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rh_push_to_registry_redhat_io.go:61 [It] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rh_push_to_registry_redhat_io.go:110 Timeline >> [FAILED] in [BeforeAll] - /tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322 @ 01/23/26 02:00:20.107 [PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 01/23/26 02:00:20.309 << Timeline [FAILED] Unexpected error: <*url.Error | 0xc0012748d0>: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host { Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/api/v1/namespaces/managed-release-team-tenant/secrets/pyxis", Err: <*net.OpError | 0xc000e6d630>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc000e6d4a0>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, } occurred In [BeforeAll] at: /tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322 @ 01/23/26 02:00:20.107 There were additional failures detected. To view them in detail run ginkgo -vv ------------------------------ SSS ------------------------------ • [FAILED] [3.730 seconds] [release-pipelines-suite e2e tests for rhtap-service-push pipeline] Rhtap-service-push happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rhtap-service-push, RhtapServicePush] [BeforeAll] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rhtap_service_push.go:75 [It] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rhtap_service_push.go:150 Timeline >> PR #3770 got created with sha b5bbdab0702e942788c63961502cb88354616536 merged result sha: 1333f420631617e310f0326eefdd8ad0a232f6ff for PR #3770 [FAILED] in [BeforeAll] - /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rhtap_service_push.go:119 @ 01/23/26 02:00:23.406 [PANICKED] in [AfterAll] - /usr/lib/golang/src/runtime/panic.go:262 @ 01/23/26 02:00:23.407 << Timeline [FAILED] Unexpected error: <*fmt.wrapError | 0xc0006baa00>: failed to get API group resources: unable to retrieve the complete list of server APIs: appstudio.redhat.com/v1alpha1: Get "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/apis/appstudio.redhat.com/v1alpha1": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host { msg: "failed to get API group resources: unable to retrieve the complete list of server APIs: appstudio.redhat.com/v1alpha1: Get \"https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/apis/appstudio.redhat.com/v1alpha1\": dial tcp: lookup api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com on 172.30.0.10:53: no such host", err: <*apiutil.ErrResourceDiscoveryFailed | 0xc000b22078>{ { Group: "appstudio.redhat.com", Version: 
"v1alpha1", }: <*url.Error | 0xc0015cb470>{ Op: "Get", URL: "https://api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com/apis/appstudio.redhat.com/v1alpha1", Err: <*net.OpError | 0xc000bd2000>{ Op: "dial", Net: "tcp", Source: nil, Addr: nil, Err: <*net.DNSError | 0xc00067e8c0>{ UnwrapErr: nil, Err: "no such host", Name: "api-toolchain-host-operator.apps.stone-stg-host.qc0p.p1.openshiftapps.com", Server: "172.30.0.10:53", IsTimeout: false, IsTemporary: false, IsNotFound: true, }, }, }, }, } occurred In [BeforeAll] at: /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rhtap_service_push.go:119 @ 01/23/26 02:00:23.406 There were additional failures detected. To view them in detail run ginkgo -vv ------------------------------ SSS••••••• ------------------------------ • [PANICKED] [76.189 seconds] [upgrade-suite Create users and check their state] [It] Verify AppStudioProvisionedUser [upgrade-verify] /tmp/tmp.ai5Z8MqDrc/tests/upgrade/verifyWorkload.go:20 Timeline >> "msg"="Observed a panic: \"invalid memory address or nil pointer dereference\" (runtime error: invalid memory address or nil pointer dereference)\ngoroutine 159 [running]:\nk8s.io/apimachinery/pkg/util/runtime.logPanic({0x2c516c0, 0x5417910})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:75 +0x85\nk8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc000f84fc0?})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:49 +0x65\npanic({0x2c516c0?, 0x5417910?})\n\t/usr/lib/golang/src/runtime/panic.go:792 +0x132\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp.func1()\n\t/tmp/tmp.ai5Z8MqDrc/pkg/sandbox/sandbox.go:319 +0x35\ngithub.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval.func1({0xee6b2800?, 0x0?})\n\t/tmp/tmp.ai5Z8MqDrc/pkg/utils/util.go:129 +0x13\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1(0xc0007cabe0?, {0x385cca8?, 0xc00084b960?})\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:53 +0x52\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x385cca8, 0xc00084b960}, {0x3851770, 0xc0007cabe0}, 0x1, 0x0, 0xc0018a5e68)\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:54 +0x115\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x385cb58?, 0x54ce1c0?}, 0xee6b2800, 0x419be5?, 0x1, 0xc0018a5e68)\n\t/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/poll.go:48 +0xa5\ngithub.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval(0xa?, 0xc0018a5eb0?, 0x1?)\n\t/tmp/tmp.ai5Z8MqDrc/pkg/utils/util.go:129 +0x45\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp(0x3244173?, {0x3244173?, 0x3240dab?}, 0x8?)\n\t/tmp/tmp.ai5Z8MqDrc/pkg/sandbox/sandbox.go:318 +0x72\ngithub.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreated(0x0, {0x3244173, 0x9})\n\t/tmp/tmp.ai5Z8MqDrc/pkg/sandbox/sandbox.go:314 +0x4b\ngithub.com/konflux-ci/e2e-tests/tests/upgrade/verify.VerifyAppStudioProvisionedUser(0xc001082ad8?)\n\t/tmp/tmp.ai5Z8MqDrc/tests/upgrade/verify/verifyUsers.go:14 +0x25\ngithub.com/konflux-ci/e2e-tests/tests/upgrade.init.func1.2()\n\t/tmp/tmp.ai5Z8MqDrc/tests/upgrade/verifyWorkload.go:21 +0x1a\ngithub.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x7c2986?, 0xc0010f2480?})\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/node.go:475 
+0x13\ngithub.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func3()\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/suite.go:894 +0x7b\ncreated by github.com/onsi/ginkgo/v2/internal.(*Suite).runNode in goroutine 78\n\t/opt/app-root/src/go/pkg/mod/github.com/onsi/ginkgo/v2@v2.22.2/internal/suite.go:881 +0xd7b" "error"=null [PANICKED] in [It] - /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 @ 01/23/26 02:01:36.07 << Timeline [PANICKED] Test Panicked In [It] at: /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 @ 01/23/26 02:01:36.07 runtime error: invalid memory address or nil pointer dereference Full Stack Trace k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc000f84fc0?}) /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56 +0xc7 panic({0x2c516c0?, 0x5417910?}) /usr/lib/golang/src/runtime/panic.go:792 +0x132 github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp.func1() /tmp/tmp.ai5Z8MqDrc/pkg/sandbox/sandbox.go:319 +0x35 github.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval.func1({0xee6b2800?, 0x0?}) /tmp/tmp.ai5Z8MqDrc/pkg/utils/util.go:129 +0x13 k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1(0xc0007cabe0?, {0x385cca8?, 0xc00084b960?}) /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:53 +0x52 k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x385cca8, 0xc00084b960}, {0x3851770, 0xc0007cabe0}, 0x1, 0x0, 0xc00190fe68) /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/loop.go:54 +0x115 k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x385cb58?, 0x54ce1c0?}, 0xee6b2800, 0x419be5?, 0x1, 0xc0018a5e68) /opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/wait/poll.go:48 +0xa5 github.com/konflux-ci/e2e-tests/pkg/utils.WaitUntilWithInterval(0xa?, 0xc0018a5eb0?, 0x1?) /tmp/tmp.ai5Z8MqDrc/pkg/utils/util.go:129 +0x45 github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreatedWithSignUp(0x3244173?, {0x3244173?, 0x3240dab?}, 0x8?) /tmp/tmp.ai5Z8MqDrc/pkg/sandbox/sandbox.go:318 +0x72 github.com/konflux-ci/e2e-tests/pkg/sandbox.(*SandboxController).CheckUserCreated(0x0, {0x3244173, 0x9}) /tmp/tmp.ai5Z8MqDrc/pkg/sandbox/sandbox.go:314 +0x4b github.com/konflux-ci/e2e-tests/tests/upgrade/verify.VerifyAppStudioProvisionedUser(0xc001082ad8?) 
/tmp/tmp.ai5Z8MqDrc/tests/upgrade/verify/verifyUsers.go:14 +0x25 github.com/konflux-ci/e2e-tests/tests/upgrade.init.func1.2() /tmp/tmp.ai5Z8MqDrc/tests/upgrade/verifyWorkload.go:21 +0x1a ------------------------------ SS•••••••••••• ------------------------------ • [FAILED] [126.791 seconds] [integration-service-suite Gitlab Status Reporting of Integration tests] Gitlab with status reporting of Integration tests in the assosiated merge request [BeforeAll] when a new Component with specified custom branch is created triggers a Build PipelineRun [integration-service, gitlab-status-reporting, custom-branch] [BeforeAll] /tmp/tmp.ai5Z8MqDrc/tests/integration-service/gitlab-integration-reporting.go:46 [It] /tmp/tmp.ai5Z8MqDrc/tests/integration-service/gitlab-integration-reporting.go:121 Timeline >> [FAILED] in [BeforeAll] - /tmp/tmp.ai5Z8MqDrc/tests/integration-service/gitlab-integration-reporting.go:62 @ 01/23/26 02:02:26.129 [FAILED] in [AfterAll] - /tmp/tmp.ai5Z8MqDrc/tests/integration-service/gitlab-integration-reporting.go:88 @ 01/23/26 02:02:26.348 << Timeline [FAILED] Unexpected error: <*errors.StatusError | 0xc000a69c20>: admission webhook "dintegrationtestscenario.kb.io" denied the request: could not find application 'integ-app-nswi' in namespace 'gitlab-rep-nyuj' { ErrStatus: { TypeMeta: {Kind: "", APIVersion: ""}, ListMeta: { SelfLink: "", ResourceVersion: "", Continue: "", RemainingItemCount: nil, }, Status: "Failure", Message: "admission webhook \"dintegrationtestscenario.kb.io\" denied the request: could not find application 'integ-app-nswi' in namespace 'gitlab-rep-nyuj'", Reason: "Forbidden", Details: nil, Code: 403, }, } occurred In [BeforeAll] at: /tmp/tmp.ai5Z8MqDrc/tests/integration-service/gitlab-integration-reporting.go:62 @ 01/23/26 02:02:26.129 There were additional failures detected. 
To view them in detail run ginkgo -vv ------------------------------ SSSSSSSSSSSSSSSSSSSS•••••••••••••••••••••••••••••••••••••••••••••••••• ------------------------------ P [PENDING] [build-service-suite Build service E2E tests] test build secret lookup when two secrets are created when second component is deleted, pac pr branch should not exist in the repo [build-service, pac-build, secret-lookup] /tmp/tmp.ai5Z8MqDrc/tests/build/build.go:1126 ------------------------------ • ------------------------------ • [FAILED] [0.304 seconds] [release-pipelines-suite [HACBS-1571]test-release-e2e-push-image-to-pyxis] Post-release verification [It] validate the result of task create-pyxis-image contains image ids [release-pipelines, rh-push-to-external-registry] /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rh_push_to_external_registry.go:233 [FAILED] Unexpected error: <*errors.errorString | 0xc0009acd80>: task with create-pyxis-image name doesn't exist in managed-vhbbt pipelinerun { s: "task with create-pyxis-image name doesn't exist in managed-vhbbt pipelinerun", } occurred In [It] at: /tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rh_push_to_external_registry.go:236 @ 01/23/26 02:06:21.275 ------------------------------ SS••••••••••••••••••••••••••••• ------------------------------ • [FAILED] [24.236 seconds] [enterprise-contract-suite Conforma E2E tests] test creating and signing an image and task verify-enterprise-contract task Release Policy [It] verifies redhat products pass the redhat policy rule collection before release [ec, pipeline] /tmp/tmp.ai5Z8MqDrc/tests/enterprise-contract/contract.go:347 Timeline >> Update public key to verify golden images Creating Pipeline "verify-enterprise-contract-run-m4csz" Waiting for pipeline "verify-enterprise-contract-run-m4csz" to finish [FAILED] in [It] - /tmp/tmp.ai5Z8MqDrc/tests/enterprise-contract/contract.go:380 @ 01/23/26 02:11:53.074 << Timeline [FAILED] Expected <[]v1.TaskRunResult | len:1, cap:4>: [ { Name: "TEST_OUTPUT", Type: "string", Value: { Type: "string", StringVal: "{\"timestamp\":\"1769134311\",\"namespace\":\"\",\"successes\":360,\"failures\":6,\"warnings\":18,\"result\":\"FAILURE\"}\n", ArrayVal: nil, ObjectVal: nil, }, }, ] not to contain elements <[]*tekton.TaskRunResultMatcher | len:1, cap:1>: [ { name: "TEST_OUTPUT", jsonPath: "{$.result}", value: nil, jsonValue: "[\"FAILURE\"]", jsonMatcher: <*matchers.MatchJSONMatcher | 0xc00130d830>{ JSONToMatch: "[\"FAILURE\"]", firstFailurePath: nil, }, }, ] In [It] at: /tmp/tmp.ai5Z8MqDrc/tests/enterprise-contract/contract.go:380 @ 01/23/26 02:11:53.074 ------------------------------ SS•••••••••••S•S• ------------------------------ P [PENDING] [build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline] /tmp/tmp.ai5Z8MqDrc/tests/build/build_templates.go:489 ------------------------------ ••••••••••••••••••••••••S•••••••••••••S•S• ------------------------------ P [PENDING] [build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build-oci-ta should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline] 
/tmp/tmp.ai5Z8MqDrc/tests/build/build_templates.go:489
------------------------------
• [FAILED] [902.554 seconds]
[integration-service-suite Creation of group snapshots for monorepo and multiple repos] with status reporting of Integration tests in CheckRuns when IntegrationTestScenario reference to task as pipelinerun resolution [It] trigger pipelinerun for invalid integrationTestScenario by annotating snapshot and verify failing to create integration pipelinerun [integration-service, group-snapshot-creation]
/tmp/tmp.ai5Z8MqDrc/tests/integration-service/group-snapshots-tests.go:730
[FAILED] Timed out after 900.001s.
timeout while waiting for group snapshot and failing integration pipelinerun with invalid resolution
Expected success, but got an error:
<*errors.errorString | 0xc000d7f950>:
failing to find the integration test status detail group-oqvh/integ-app-ikny-20260123-022839-980 for invalid resolution
{ s: "failing to find the integration test status detail group-oqvh/integ-app-ikny-20260123-022839-980 for invalid resolution", }
In [It] at: /tmp/tmp.ai5Z8MqDrc/tests/integration-service/group-snapshots-tests.go:753 @ 01/23/26 02:43:46.653
------------------------------

Summarizing 10 Failures:
[PANICKED!] [upgrade-suite Create users and check their state] [It] Verify AppStudioProvisionedUser [upgrade-verify]
/opt/app-root/src/go/pkg/mod/k8s.io/apimachinery@v0.29.4/pkg/util/runtime/runtime.go:56
[FAIL] [integration-service-suite Gitlab Status Reporting of Integration tests] Gitlab with status reporting of Integration tests in the assosiated merge request [BeforeAll] when a new Component with specified custom branch is created triggers a Build PipelineRun [integration-service, gitlab-status-reporting, custom-branch]
/tmp/tmp.ai5Z8MqDrc/tests/integration-service/gitlab-integration-reporting.go:62
[FAIL] [release-pipelines-suite e2e tests for rhtap-service-push pipeline] Rhtap-service-push happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rhtap-service-push, RhtapServicePush]
/tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rhtap_service_push.go:119
[FAIL] [release-pipelines-suite e2e tests for rh-push-to-redhat-io pipeline] Rh-push-to-redhat-io happy path [BeforeAll] Post-release verification verifies if the release CR is created [release-pipelines, rh-push-to-registry-redhat-io, PushToRedhatIO]
/tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322
[FAIL] [release-pipelines-suite e2e tests for rh-advisories pipeline] Rh-advisories happy path [BeforeAll] Post-release verification verifies if release CR is created [release-pipelines, rh-advisories, rhAdvisories]
/tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322
[FAIL] [release-pipelines-suite [HACBS-1571]test-release-e2e-push-image-to-pyxis] Post-release verification [It] validate the result of task create-pyxis-image contains image ids [release-pipelines, rh-push-to-external-registry]
/tmp/tmp.ai5Z8MqDrc/tests/release/pipelines/rh_push_to_external_registry.go:236
[FAIL] [enterprise-contract-suite Conforma E2E tests] test creating and signing an image and task verify-enterprise-contract task Release Policy [It] verifies redhat products pass the redhat policy rule collection before release [ec, pipeline]
/tmp/tmp.ai5Z8MqDrc/tests/enterprise-contract/contract.go:380
[FAIL] [release-pipelines-suite FBC e2e-tests] with FBC happy path [BeforeAll] Post-release verification creates component from git source https://github.com/redhat-appstudio-qe/fbc-sample-repo-test [release-pipelines, fbc-release, fbcHappyPath]
/tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322
[FAIL] [release-pipelines-suite e2e tests for multi arch with rh-advisories pipeline] Multi arch test happy path [BeforeAll] Post-release verification verifies the release CR is created [release-pipelines, rh-advisories, multiarch-advisories, multiArchAdvisories]
/tmp/tmp.ai5Z8MqDrc/tests/release/releaseLib.go:322
[FAIL] [integration-service-suite Creation of group snapshots for monorepo and multiple repos] with status reporting of Integration tests in CheckRuns when IntegrationTestScenario reference to task as pipelinerun resolution [It] trigger pipelinerun for invalid integrationTestScenario by annotating snapshot and verify failing to create integration pipelinerun [integration-service, group-snapshot-creation]
/tmp/tmp.ai5Z8MqDrc/tests/integration-service/group-snapshots-tests.go:753

Ran 295 of 388 Specs in 2609.765 seconds
FAIL! -- 285 Passed | 10 Failed | 34 Pending | 59 Skipped

Ginkgo ran 1 suite in 45m3.087417064s
Test Suite Failed
Error: running "ginkgo --seed=1769132742 --timeout=1h30m0s --grace-period=30s --output-interceptor-mode=none --no-color --json-report=e2e-report.json --junit-report=e2e-report.xml --procs=20 --nodes=20 --p --output-dir=/workspace/artifact-dir ./cmd --" failed with exit code 1
make: *** [Makefile:25: ci/test/e2e] Error 1
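One failure that is not environmental: the upgrade-suite panic earlier shows CheckUserCreated being invoked on a nil *SandboxController (the receiver is 0x0 in the stack trace), so a missing prerequisite turns into a nil-pointer crash. A minimal sketch of a guard, with the type and method names taken from the stack trace but the fields, signatures, and actual fix left as assumptions:

// Sketch only: SandboxController here is a stand-in for the controller in pkg/sandbox;
// the real struct and method signatures may differ.
package sandbox

import "fmt"

type SandboxController struct{}

func (s *SandboxController) checkUserCreatedWithSignUp(user string, signUp bool) (bool, error) {
	// real lookup elided in this sketch
	return false, nil
}

// CheckUserCreated guards against the nil controller seen in the panic above and
// returns a readable error instead of crashing the spec.
func (s *SandboxController) CheckUserCreated(user string) (bool, error) {
	if s == nil {
		return false, fmt.Errorf("sandbox controller is nil; upgrade test framework was not initialized before verifying user %q", user)
	}
	return s.checkUserCreatedWithSignUp(user, true)
}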