./mage -v ci:teste2e
Running target: CI:TestE2E
I1105 19:40:21.603883 16569 magefile.go:529] setting up new custom bundle for testing...
I1105 19:40:21.952688 16569 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762371621-hcxt -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
- Added Pipeline: docker-build to image
I1105 19:40:23.513290 16569 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762371621-hcxt: quay.io/redhat-appstudio-qe/test-images@sha256:6afe055591de7c643de7ed65b0252df12112838531fefd43b8fdc22874c9be77
I1105 19:40:23.513324 16569 magefile.go:535] To use the custom docker bundle locally, run below cmd:
export CUSTOM_DOCKER_BUILD_PIPELINE_BUNDLE=quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762371621-hcxt
I1105 19:40:23.513354 16569 integration_service.go:49] checking if repository is integration-service
I1105 19:40:23.513363 16569 image_controller.go:49] checking if repository is image-controller
I1105 19:40:23.513369 16569 build_service.go:49] checking if repository is build-service
I1105 19:40:23.513375 16569 e2e_repo.go:347] checking if repository is e2e-tests
I1105 19:40:23.513379 16569 e2e_repo.go:335] multi-platform tests and require sprayproxy registering are set to TRUE
exec: git "diff" "--name-status" "upstream/main..HEAD"
I1105 19:40:23.516693 16569 util.go:451] The following files, pkg/clients/has/components.go, pkg/clients/tekton/pipelineruns.go, tests/build/build_templates.go, were changed!
exec: go "install" "-mod=mod" "github.com/onsi/ginkgo/v2/ginkgo"
go: downloading github.com/go-task/slim-sprig/v3 v3.0.0
go: downloading github.com/google/pprof v0.0.0-20241210010833-40e02aabc2ad
I1105 19:40:26.652695 16569 install.go:188] cloning 'https://github.com/redhat-appstudio/infra-deployments' with git ref 'refs/heads/main'
Enumerating objects: 67469, done.
Counting objects: 100% (145/145), done.
Compressing objects: 100% (75/75), done.
Total 67469 (delta 104), reused 82 (delta 69), pack-reused 67324 (from 5)
From https://github.com/redhat-appstudio/infra-deployments
 * branch            main       -> FETCH_HEAD
Already up to date.
Installing the OpenShift GitOps operator subscription:
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller created
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server created
subscription.operators.coreos.com/openshift-gitops-operator created
Waiting for default project (and namespace) to exist: ................................OK
Waiting for OpenShift GitOps Route: OK
argocd.argoproj.io/openshift-gitops patched
argocd.argoproj.io/openshift-gitops patched
Switch the Route to use re-encryption
argocd.argoproj.io/openshift-gitops patched
Restarting ArgoCD Server
pod "openshift-gitops-server-78868c5878-l4dsw" deleted
Allow any authenticated users to be admin on the Argo CD instance
argocd.argoproj.io/openshift-gitops patched
Mark Pending PVC as Healthy, workaround for WaitForFirstConsumer StorageClasses.
Warning: unknown field "spec.resourceCustomizations"
argocd.argoproj.io/openshift-gitops patched (no change)
Setting kustomize build options
argocd.argoproj.io/openshift-gitops patched
Setting ignore Aggregated Roles
argocd.argoproj.io/openshift-gitops patched
Setting ArgoCD tracking method to annotation
argocd.argoproj.io/openshift-gitops patched
Restarting GitOps server
deployment.apps/openshift-gitops-server restarted
=========================================================================
Argo CD URL is: https://openshift-gitops-server-openshift-gitops.apps.rosa.kx-6f23184ad7.4331.p3.openshiftapps.com
(NOTE: It may take a few moments for the route to become available)
Waiting for the route: .OK
Login/password uses your OpenShift credentials ('Login with OpenShift' button)
Setting secrets for Quality Dashboard
namespace/quality-dashboard created
secret/quality-dashboard-secrets created
Creating secret for CI Helper App
namespace/ci-helper-app created
secret/ci-helper-app-secrets created
Setting secrets for pipeline-service
tekton-results namespace already exists, skipping creation
tekton-logging namespace already exists, skipping creation
namespace/product-kubearchive-logging created
Creating DB secret
secret/tekton-results-database created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
secret/minio-storage-configuration created
Creating S3 secret
secret/tekton-results-s3 created
Creating MinIO config
MinIO config already exists, skipping creation
Creating Postgres TLS certs
-----
Certificate request self-signature ok
subject=CN=cluster.local
-----
Certificate request self-signature ok
subject=CN=postgres-postgresql.tekton-results.svc.cluster.local
secret/postgresql-tls created
configmap/rds-root-crt created
namespace/application-service created
Creating a has secret from legacy token
secret/has-github-token created
Creating a secret with a token for Image Controller
namespace/image-controller created
secret/quaytoken created
Configuring the cluster with a pull secret for Docker Hub
Saved credentials for docker.io into /tmp/tmp.lGswt46ufd
secret/pull-secret data updated
Saved credentials for docker.io into /tmp/tmp.lGswt46ufd
secret/docker-io-pull created
Setting secrets for Dora metrics exporter
namespace/dora-metrics created
secret/exporters-secret created
Setting Cluster Mode: preview
Switched to a new branch 'preview-main-bmhi'
labeling node/ip-10-0-137-225.ec2.internal...
node/ip-10-0-137-225.ec2.internal labeled
successfully labeled node/ip-10-0-137-225.ec2.internal
labeling node/ip-10-0-153-179.ec2.internal...
node/ip-10-0-153-179.ec2.internal labeled
successfully labeled node/ip-10-0-153-179.ec2.internal
labeling node/ip-10-0-165-237.ec2.internal...
node/ip-10-0-165-237.ec2.internal labeled
successfully labeled node/ip-10-0-165-237.ec2.internal
verifying labels...
all nodes labeled successfully.
Detected OCP minor version: 17
Changing AppStudio Gitlab Org to "redhat-appstudio-qe"
[preview-main-bmhi e13df3c92] Preview mode, do not merge into main
 6 files changed, 12 insertions(+), 18 deletions(-)
remote:
remote: Create a pull request for 'preview-main-bmhi' on GitHub by visiting:
remote:      https://github.com/redhat-appstudio-qe/infra-deployments/pull/new/preview-main-bmhi
remote:
To https://github.com/redhat-appstudio-qe/infra-deployments.git
 * [new branch]      preview-main-bmhi -> preview-main-bmhi
branch 'preview-main-bmhi' set up to track 'qe/preview-main-bmhi'.
application.argoproj.io/all-application-sets created
Waiting for sync of all-application-sets argoCD app
Waiting for sync of all-application-sets argoCD app
Waiting for sync of all-application-sets argoCD app
Waiting for sync of all-application-sets argoCD app
Waiting for sync of all-application-sets argoCD app
Waiting for sync of all-application-sets argoCD app
application.argoproj.io/dora-metrics-in-cluster-local patched
application.argoproj.io/application-api-in-cluster-local patched
application.argoproj.io/cert-manager-in-cluster-local patched
application.argoproj.io/integration-in-cluster-local patched
application.argoproj.io/mintmaker-in-cluster-local patched
application.argoproj.io/knative-eventing-in-cluster-local patched
application.argoproj.io/disable-csvcopy-in-cluster-local patched
application.argoproj.io/build-templates-in-cluster-local patched
application.argoproj.io/pipeline-service-in-cluster-local patched
application.argoproj.io/kyverno-in-cluster-local patched
application.argoproj.io/has-in-cluster-local patched
application.argoproj.io/vector-kubearchive-log-collector-in-cluster-local patched (no change)
application.argoproj.io/multi-platform-controller-in-cluster-local patched
application.argoproj.io/crossplane-control-plane-in-cluster-local patched
application.argoproj.io/all-application-sets patched
application.argoproj.io/konflux-kite-in-cluster-local patched
application.argoproj.io/repository-validator-in-cluster-local patched
application.argoproj.io/monitoring-workload-prometheus-in-cluster-local patched
application.argoproj.io/image-controller-in-cluster-local patched
application.argoproj.io/monitoring-workload-grafana-in-cluster-local patched
application.argoproj.io/squid-in-cluster-local patched
application.argoproj.io/trust-manager-in-cluster-local patched
application.argoproj.io/kubearchive-in-cluster-local patched
application.argoproj.io/konflux-rbac-in-cluster-local patched
application.argoproj.io/enterprise-contract-in-cluster-local patched
application.argoproj.io/perf-team-prometheus-reader-in-cluster-local patched
application.argoproj.io/policies-in-cluster-local patched
application.argoproj.io/project-controller-in-cluster-local patched
application.argoproj.io/release-in-cluster-local patched
application.argoproj.io/vector-tekton-logs-collector-in-cluster-local patched
application.argoproj.io/tracing-workload-tracing-in-cluster-local patched
application.argoproj.io/tempo-in-cluster-local patched
application.argoproj.io/kueue-in-cluster-local patched
application.argoproj.io/tracing-workload-otel-collector-in-cluster-local patched
application.argoproj.io/internal-services-in-cluster-local patched
build-service-in-cluster-local  Synced  Progressing
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
enterprise-contract-in-cluster-local  OutOfSync  Missing
has-in-cluster-local  Synced  Progressing
image-controller-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
konflux-kite-in-cluster-local  OutOfSync  Missing
konflux-rbac-in-cluster-local  OutOfSync  Healthy
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
project-controller-in-cluster-local  OutOfSync  Missing
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Missing
tracing-workload-otel-collector-in-cluster-local  OutOfSync  Progressing
tracing-workload-tracing-in-cluster-local  OutOfSync  Healthy
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
vector-tekton-logs-collector-in-cluster-local  OutOfSync  Healthy
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-controller-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
konflux-kite-in-cluster-local  OutOfSync  Missing
konflux-rbac-in-cluster-local  OutOfSync  Healthy
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
postgres
project-controller-in-cluster-local  OutOfSync  Missing
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Missing
tracing-workload-otel-collector-in-cluster-local  Synced  Progressing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
vector-tekton-logs-collector-in-cluster-local  OutOfSync  Healthy
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-controller-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
konflux-kite-in-cluster-local  OutOfSync  Progressing
konflux-rbac-in-cluster-local  OutOfSync  Healthy
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
postgres  OutOfSync  Missing
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Missing
tracing-workload-otel-collector-in-cluster-local  Synced  Progressing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
vector-tekton-logs-collector-in-cluster-local  OutOfSync  Healthy
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
crossplane-control-plane-in-cluster-local  OutOfSync  Healthy
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-controller-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
konflux-kite-in-cluster-local  OutOfSync  Progressing
konflux-rbac-in-cluster-local  OutOfSync  Healthy
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Degraded
policies-in-cluster-local  OutOfSync  Healthy
postgres  Synced  Progressing
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
vector-tekton-logs-collector-in-cluster-local  OutOfSync  Healthy
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-controller-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
konflux-rbac-in-cluster-local  OutOfSync  Healthy
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Degraded
policies-in-cluster-local  OutOfSync  Healthy
postgres  Synced  Progressing
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-controller-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Degraded
policies-in-cluster-local  OutOfSync  Healthy
postgres  Synced  Progressing
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Missing
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-controller-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Healthy
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
enterprise-contract-in-cluster-local  OutOfSync  Missing
image-controller-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
internal-services-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Healthy
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Missing
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
enterprise-contract-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Degraded
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Healthy
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Progressing
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
enterprise-contract-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  OutOfSync  Missing
squid-in-cluster-local  OutOfSync  Healthy
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Progressing
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
enterprise-contract-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Healthy
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Progressing
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
enterprise-contract-in-cluster-local  OutOfSync  Missing
integration-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Healthy
tracing-workload-tracing-in-cluster-local  OutOfSync  Healthy
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  OutOfSync  Progressing
Waiting 10 seconds for application sync
build-service-in-cluster-local  Synced  Progressing
integration-in-cluster-local  OutOfSync  Missing
knative-eventing-in-cluster-local  OutOfSync  Missing
kubearchive-in-cluster-local  OutOfSync  Missing
kueue-in-cluster-local  OutOfSync  Missing
kyverno-in-cluster-local  OutOfSync  Missing
mintmaker-in-cluster-local  OutOfSync  Missing
monitoring-workload-grafana-in-cluster-local  OutOfSync  Missing
multi-platform-controller-in-cluster-local  OutOfSync  Missing
pipeline-service-in-cluster-local  OutOfSync  Missing
policies-in-cluster-local  OutOfSync  Healthy
release-in-cluster-local  Synced  Progressing
squid-in-cluster-local  OutOfSync  Healthy
tracing-workload-tracing-in-cluster-local  OutOfSync  Healthy
trust-manager-in-cluster-local  OutOfSync  Missing
vector-kubearchive-log-collector-in-cluster-local  Unknown  Healthy
vector-kubearchive-log-collector-in-cluster-local failed with: [{"lastTransitionTime":"2025-11-05T19:44:58Z","message":"Failed to load target state: failed to generate manifest for source 1 of 1: rpc error: code = Unknown desc = `kustomize build \u003cpath to cached source\u003e/components/vector-kubearchive-log-collector/development --enable-helm --helm-kube-version 1.30 --helm-api-versions admissionregistration.k8s.io/v1 --helm-api-versions admissionregistration.k8s.io/v1/MutatingWebhookConfiguration --helm-api-versions admissionregistration.k8s.io/v1/ValidatingAdmissionPolicy --helm-api-versions admissionregistration.k8s.io/v1/ValidatingAdmissionPolicyBinding --helm-api-versions admissionregistration.k8s.io/v1/ValidatingWebhookConfiguration --helm-api-versions admissionregistration.k8s.io/v1beta1 --helm-api-versions admissionregistration.k8s.io/v1beta1/ValidatingAdmissionPolicy --helm-api-versions admissionregistration.k8s.io/v1beta1/ValidatingAdmissionPolicyBinding --helm-api-versions
apiextensions.crossplane.io/v1 --helm-api-versions apiextensions.crossplane.io/v1/CompositeResourceDefinition --helm-api-versions apiextensions.crossplane.io/v1/Composition --helm-api-versions apiextensions.crossplane.io/v1/CompositionRevision --helm-api-versions apiextensions.crossplane.io/v1alpha1 --helm-api-versions apiextensions.crossplane.io/v1alpha1/ManagedResourceActivationPolicy --helm-api-versions apiextensions.crossplane.io/v1alpha1/ManagedResourceDefinition --helm-api-versions apiextensions.crossplane.io/v1alpha1/Usage --helm-api-versions apiextensions.crossplane.io/v1beta1 --helm-api-versions apiextensions.crossplane.io/v1beta1/EnvironmentConfig --helm-api-versions apiextensions.crossplane.io/v1beta1/Usage --helm-api-versions apiextensions.crossplane.io/v2 --helm-api-versions apiextensions.crossplane.io/v2/CompositeResourceDefinition --helm-api-versions apiextensions.k8s.io/v1 --helm-api-versions apiextensions.k8s.io/v1/CustomResourceDefinition --helm-api-versions apiregistration.k8s.io/v1 --helm-api-versions apiregistration.k8s.io/v1/APIService --helm-api-versions apiserver.openshift.io/v1 --helm-api-versions apiserver.openshift.io/v1/APIRequestCount --helm-api-versions apps.openshift.io/v1 --helm-api-versions apps.openshift.io/v1/DeploymentConfig --helm-api-versions apps/v1 --helm-api-versions apps/v1/ControllerRevision --helm-api-versions apps/v1/DaemonSet --helm-api-versions apps/v1/Deployment --helm-api-versions apps/v1/ReplicaSet --helm-api-versions apps/v1/StatefulSet --helm-api-versions appstudio.redhat.com/v1alpha1 --helm-api-versions appstudio.redhat.com/v1alpha1/Application --helm-api-versions appstudio.redhat.com/v1alpha1/Component --helm-api-versions appstudio.redhat.com/v1alpha1/ComponentDetectionQuery --helm-api-versions appstudio.redhat.com/v1alpha1/DeploymentTarget --helm-api-versions appstudio.redhat.com/v1alpha1/DeploymentTargetClaim --helm-api-versions appstudio.redhat.com/v1alpha1/DeploymentTargetClass --helm-api-versions 
appstudio.redhat.com/v1alpha1/EnterpriseContractPolicy --helm-api-versions appstudio.redhat.com/v1alpha1/Environment --helm-api-versions appstudio.redhat.com/v1alpha1/ImageRepository --helm-api-versions appstudio.redhat.com/v1alpha1/InternalRequest --helm-api-versions appstudio.redhat.com/v1alpha1/InternalServicesConfig --helm-api-versions appstudio.redhat.com/v1alpha1/PromotionRun --helm-api-versions appstudio.redhat.com/v1alpha1/Release --helm-api-versions appstudio.redhat.com/v1alpha1/ReleasePlan --helm-api-versions appstudio.redhat.com/v1alpha1/ReleasePlanAdmission --helm-api-versions appstudio.redhat.com/v1alpha1/ReleaseServiceConfig --helm-api-versions appstudio.redhat.com/v1alpha1/Snapshot --helm-api-versions appstudio.redhat.com/v1alpha1/SnapshotEnvironmentBinding --helm-api-versions argoproj.io/v1alpha1 --helm-api-versions argoproj.io/v1alpha1/AnalysisRun --helm-api-versions argoproj.io/v1alpha1/AnalysisTemplate --helm-api-versions argoproj.io/v1alpha1/AppProject --helm-api-versions argoproj.io/v1alpha1/Application --helm-api-versions argoproj.io/v1alpha1/ApplicationSet --helm-api-versions argoproj.io/v1alpha1/ArgoCD --helm-api-versions argoproj.io/v1alpha1/ClusterAnalysisTemplate --helm-api-versions argoproj.io/v1alpha1/Experiment --helm-api-versions argoproj.io/v1alpha1/NotificationsConfiguration --helm-api-versions argoproj.io/v1alpha1/Rollout --helm-api-versions argoproj.io/v1alpha1/RolloutManager --helm-api-versions argoproj.io/v1beta1 --helm-api-versions argoproj.io/v1beta1/ArgoCD --helm-api-versions authorization.openshift.io/v1 --helm-api-versions authorization.openshift.io/v1/RoleBindingRestriction --helm-api-versions autoscaling/v1 --helm-api-versions autoscaling/v1/HorizontalPodAutoscaler --helm-api-versions autoscaling/v2 --helm-api-versions autoscaling/v2/HorizontalPodAutoscaler --helm-api-versions batch/v1 --helm-api-versions batch/v1/CronJob --helm-api-versions batch/v1/Job --helm-api-versions build.openshift.io/v1 --helm-api-versions 
build.openshift.io/v1/Build --helm-api-versions build.openshift.io/v1/BuildConfig --helm-api-versions certificates.k8s.io/v1 --helm-api-versions certificates.k8s.io/v1/CertificateSigningRequest --helm-api-versions cloud.network.openshift.io/v1 --helm-api-versions cloud.network.openshift.io/v1/CloudPrivateIPConfig --helm-api-versions cloudcredential.openshift.io/v1 --helm-api-versions cloudcredential.openshift.io/v1/CredentialsRequest --helm-api-versions cluster.open-cluster-management.io/v1alpha1 --helm-api-versions cluster.open-cluster-management.io/v1alpha1/ClusterClaim --helm-api-versions config.openshift.io/v1 --helm-api-versions config.openshift.io/v1/APIServer --helm-api-versions config.openshift.io/v1/Authentication --helm-api-versions config.openshift.io/v1/Build --helm-api-versions config.openshift.io/v1/ClusterOperator --helm-api-versions config.openshift.io/v1/ClusterVersion --helm-api-versions config.openshift.io/v1/Console --helm-api-versions config.openshift.io/v1/DNS --helm-api-versions config.openshift.io/v1/FeatureGate --helm-api-versions config.openshift.io/v1/Image --helm-api-versions config.openshift.io/v1/ImageContentPolicy --helm-api-versions config.openshift.io/v1/ImageDigestMirrorSet --helm-api-versions config.openshift.io/v1/ImageTagMirrorSet --helm-api-versions config.openshift.io/v1/Infrastructure --helm-api-versions config.openshift.io/v1/Ingress --helm-api-versions config.openshift.io/v1/Network --helm-api-versions config.openshift.io/v1/Node --helm-api-versions config.openshift.io/v1/OAuth --helm-api-versions config.openshift.io/v1/OperatorHub --helm-api-versions config.openshift.io/v1/Project --helm-api-versions config.openshift.io/v1/Proxy --helm-api-versions config.openshift.io/v1/Scheduler --helm-api-versions console.openshift.io/v1 --helm-api-versions console.openshift.io/v1/ConsoleCLIDownload --helm-api-versions console.openshift.io/v1/ConsoleExternalLogLink --helm-api-versions console.openshift.io/v1/ConsoleLink 
--helm-api-versions console.openshift.io/v1/ConsoleNotification --helm-api-versions console.openshift.io/v1/ConsolePlugin --helm-api-versions console.openshift.io/v1/ConsoleQuickStart --helm-api-versions console.openshift.io/v1/ConsoleSample --helm-api-versions console.openshift.io/v1/ConsoleYAMLSample --helm-api-versions console.openshift.io/v1alpha1 --helm-api-versions console.openshift.io/v1alpha1/ConsolePlugin --helm-api-versions controlplane.operator.openshift.io/v1alpha1 --helm-api-versions controlplane.operator.openshift.io/v1alpha1/PodNetworkConnectivityCheck --helm-api-versions coordination.k8s.io/v1 --helm-api-versions coordination.k8s.io/v1/Lease --helm-api-versions discovery.k8s.io/v1 --helm-api-versions discovery.k8s.io/v1/EndpointSlice --helm-api-versions events.k8s.io/v1 --helm-api-versions events.k8s.io/v1/Event --helm-api-versions flowcontrol.apiserver.k8s.io/v1 --helm-api-versions flowcontrol.apiserver.k8s.io/v1/FlowSchema --helm-api-versions flowcontrol.apiserver.k8s.io/v1/PriorityLevelConfiguration --helm-api-versions flowcontrol.apiserver.k8s.io/v1beta3 --helm-api-versions flowcontrol.apiserver.k8s.io/v1beta3/FlowSchema --helm-api-versions flowcontrol.apiserver.k8s.io/v1beta3/PriorityLevelConfiguration --helm-api-versions helm.openshift.io/v1beta1 --helm-api-versions helm.openshift.io/v1beta1/HelmChartRepository --helm-api-versions helm.openshift.io/v1beta1/ProjectHelmChartRepository --helm-api-versions image.openshift.io/v1 --helm-api-versions image.openshift.io/v1/Image --helm-api-versions image.openshift.io/v1/ImageStream --helm-api-versions imageregistry.operator.openshift.io/v1 --helm-api-versions imageregistry.operator.openshift.io/v1/Config --helm-api-versions imageregistry.operator.openshift.io/v1/ImagePruner --helm-api-versions ingress.operator.openshift.io/v1 --helm-api-versions ingress.operator.openshift.io/v1/DNSRecord --helm-api-versions k8s.cni.cncf.io/v1 --helm-api-versions k8s.cni.cncf.io/v1/NetworkAttachmentDefinition 
--helm-api-versions k8s.ovn.org/v1 --helm-api-versions k8s.ovn.org/v1/AdminPolicyBasedExternalRoute --helm-api-versions k8s.ovn.org/v1/EgressFirewall --helm-api-versions k8s.ovn.org/v1/EgressIP --helm-api-versions k8s.ovn.org/v1/EgressQoS --helm-api-versions k8s.ovn.org/v1/EgressService --helm-api-versions machineconfiguration.openshift.io/v1 --helm-api-versions machineconfiguration.openshift.io/v1/ControllerConfig --helm-api-versions migration.k8s.io/v1alpha1 --helm-api-versions migration.k8s.io/v1alpha1/StorageState --helm-api-versions migration.k8s.io/v1alpha1/StorageVersionMigration --helm-api-versions monitoring.coreos.com/v1 --helm-api-versions monitoring.coreos.com/v1/Alertmanager --helm-api-versions monitoring.coreos.com/v1/PodMonitor --helm-api-versions monitoring.coreos.com/v1/Probe --helm-api-versions monitoring.coreos.com/v1/Prometheus --helm-api-versions monitoring.coreos.com/v1/PrometheusRule --helm-api-versions monitoring.coreos.com/v1/ServiceMonitor --helm-api-versions monitoring.coreos.com/v1/ThanosRuler --helm-api-versions monitoring.coreos.com/v1alpha1 --helm-api-versions monitoring.coreos.com/v1alpha1/AlertmanagerConfig --helm-api-versions monitoring.coreos.com/v1beta1 --helm-api-versions monitoring.coreos.com/v1beta1/AlertmanagerConfig --helm-api-versions monitoring.openshift.io/v1 --helm-api-versions monitoring.openshift.io/v1/AlertRelabelConfig --helm-api-versions monitoring.openshift.io/v1/AlertingRule --helm-api-versions network.operator.openshift.io/v1 --helm-api-versions network.operator.openshift.io/v1/EgressRouter --helm-api-versions network.operator.openshift.io/v1/OperatorPKI --helm-api-versions networking.k8s.io/v1 --helm-api-versions networking.k8s.io/v1/Ingress --helm-api-versions networking.k8s.io/v1/IngressClass --helm-api-versions networking.k8s.io/v1/NetworkPolicy --helm-api-versions node.k8s.io/v1 --helm-api-versions node.k8s.io/v1/RuntimeClass --helm-api-versions oauth.openshift.io/v1 --helm-api-versions 
oauth.openshift.io/v1/OAuthAccessToken --helm-api-versions oauth.openshift.io/v1/OAuthAuthorizeToken --helm-api-versions oauth.openshift.io/v1/OAuthClient --helm-api-versions oauth.openshift.io/v1/OAuthClientAuthorization --helm-api-versions oauth.openshift.io/v1/UserOAuthAccessToken --helm-api-versions operator.openshift.io/v1 --helm-api-versions operator.openshift.io/v1/CSISnapshotController --helm-api-versions operator.openshift.io/v1/CloudCredential --helm-api-versions operator.openshift.io/v1/ClusterCSIDriver --helm-api-versions operator.openshift.io/v1/Config --helm-api-versions operator.openshift.io/v1/Console --helm-api-versions operator.openshift.io/v1/DNS --helm-api-versions operator.openshift.io/v1/Etcd --helm-api-versions operator.openshift.io/v1/IngressController --helm-api-versions operator.openshift.io/v1/InsightsOperator --helm-api-versions operator.openshift.io/v1/KubeAPIServer --helm-api-versions operator.openshift.io/v1/KubeControllerManager --helm-api-versions operator.openshift.io/v1/KubeScheduler --helm-api-versions operator.openshift.io/v1/KubeStorageVersionMigrator --helm-api-versions operator.openshift.io/v1/MachineConfiguration --helm-api-versions operator.openshift.io/v1/Network --helm-api-versions operator.openshift.io/v1/OpenShiftAPIServer --helm-api-versions operator.openshift.io/v1/OpenShiftControllerManager --helm-api-versions operator.openshift.io/v1/ServiceCA --helm-api-versions operator.openshift.io/v1/Storage --helm-api-versions operator.openshift.io/v1alpha1 --helm-api-versions operator.openshift.io/v1alpha1/CertManager --helm-api-versions operator.openshift.io/v1alpha1/ImageContentSourcePolicy --helm-api-versions operators.coreos.com/v1 --helm-api-versions operators.coreos.com/v1/OLMConfig --helm-api-versions operators.coreos.com/v1/Operator --helm-api-versions operators.coreos.com/v1/OperatorCondition --helm-api-versions operators.coreos.com/v1/OperatorGroup --helm-api-versions operators.coreos.com/v1alpha1 --helm-api-versions 
operators.coreos.com/v1alpha1/CatalogSource --helm-api-versions operators.coreos.com/v1alpha1/ClusterServiceVersion --helm-api-versions operators.coreos.com/v1alpha1/InstallPlan --helm-api-versions operators.coreos.com/v1alpha1/Subscription --helm-api-versions operators.coreos.com/v1alpha2 --helm-api-versions operators.coreos.com/v1alpha2/OperatorGroup --helm-api-versions operators.coreos.com/v2 --helm-api-versions operators.coreos.com/v2/OperatorCondition --helm-api-versions ops.crossplane.io/v1alpha1 --helm-api-versions ops.crossplane.io/v1alpha1/CronOperation --helm-api-versions ops.crossplane.io/v1alpha1/Operation --helm-api-versions ops.crossplane.io/v1alpha1/WatchOperation --helm-api-versions package-operator.run/v1alpha1 --helm-api-versions package-operator.run/v1alpha1/ClusterObjectDeployment --helm-api-versions package-operator.run/v1alpha1/ClusterObjectSet --helm-api-versions package-operator.run/v1alpha1/ClusterObjectSetPhase --helm-api-versions package-operator.run/v1alpha1/ClusterObjectSlice --helm-api-versions package-operator.run/v1alpha1/ClusterObjectTemplate --helm-api-versions package-operator.run/v1alpha1/ClusterPackage --helm-api-versions package-operator.run/v1alpha1/ObjectDeployment --helm-api-versions package-operator.run/v1alpha1/ObjectSet --helm-api-versions package-operator.run/v1alpha1/ObjectSetPhase --helm-api-versions package-operator.run/v1alpha1/ObjectSlice --helm-api-versions package-operator.run/v1alpha1/ObjectTemplate --helm-api-versions package-operator.run/v1alpha1/Package --helm-api-versions pipelines.openshift.io/v1alpha1 --helm-api-versions pipelines.openshift.io/v1alpha1/GitopsService --helm-api-versions pkg.crossplane.io/v1 --helm-api-versions pkg.crossplane.io/v1/Configuration --helm-api-versions pkg.crossplane.io/v1/ConfigurationRevision --helm-api-versions pkg.crossplane.io/v1/Function --helm-api-versions pkg.crossplane.io/v1/FunctionRevision --helm-api-versions pkg.crossplane.io/v1/Provider --helm-api-versions 
pkg.crossplane.io/v1/ProviderRevision --helm-api-versions pkg.crossplane.io/v1beta1 --helm-api-versions pkg.crossplane.io/v1beta1/DeploymentRuntimeConfig --helm-api-versions pkg.crossplane.io/v1beta1/Function --helm-api-versions pkg.crossplane.io/v1beta1/FunctionRevision --helm-api-versions pkg.crossplane.io/v1beta1/ImageConfig --helm-api-versions pkg.crossplane.io/v1beta1/Lock --helm-api-versions policy.networking.k8s.io/v1alpha1 --helm-api-versions policy.networking.k8s.io/v1alpha1/AdminNetworkPolicy --helm-api-versions policy.networking.k8s.io/v1alpha1/BaselineAdminNetworkPolicy --helm-api-versions policy/v1 --helm-api-versions policy/v1/PodDisruptionBudget --helm-api-versions projctl.konflux.dev/v1beta1 --helm-api-versions projctl.konflux.dev/v1beta1/Project --helm-api-versions projctl.konflux.dev/v1beta1/ProjectDevelopmentStream --helm-api-versions projctl.konflux.dev/v1beta1/ProjectDevelopmentStreamTemplate --helm-api-versions project.openshift.io/v1 --helm-api-versions project.openshift.io/v1/Project --helm-api-versions protection.crossplane.io/v1beta1 --helm-api-versions protection.crossplane.io/v1beta1/ClusterUsage --helm-api-versions protection.crossplane.io/v1beta1/Usage --helm-api-versions quota.openshift.io/v1 --helm-api-versions quota.openshift.io/v1/ClusterResourceQuota --helm-api-versions rbac.authorization.k8s.io/v1 --helm-api-versions rbac.authorization.k8s.io/v1/ClusterRole --helm-api-versions rbac.authorization.k8s.io/v1/ClusterRoleBinding --helm-api-versions rbac.authorization.k8s.io/v1/Role --helm-api-versions rbac.authorization.k8s.io/v1/RoleBinding --helm-api-versions route.openshift.io/v1 --helm-api-versions route.openshift.io/v1/Route --helm-api-versions samples.operator.openshift.io/v1 --helm-api-versions samples.operator.openshift.io/v1/Config --helm-api-versions scheduling.k8s.io/v1 --helm-api-versions scheduling.k8s.io/v1/PriorityClass --helm-api-versions security.internal.openshift.io/v1 --helm-api-versions 
security.internal.openshift.io/v1/RangeAllocation --helm-api-versions security.openshift.io/v1 --helm-api-versions security.openshift.io/v1/RangeAllocation --helm-api-versions security.openshift.io/v1/SecurityContextConstraints --helm-api-versions snapshot.storage.k8s.io/v1 --helm-api-versions snapshot.storage.k8s.io/v1/VolumeSnapshot --helm-api-versions snapshot.storage.k8s.io/v1/VolumeSnapshotClass --helm-api-versions snapshot.storage.k8s.io/v1/VolumeSnapshotContent --helm-api-versions storage.k8s.io/v1 --helm-api-versions storage.k8s.io/v1/CSIDriver --helm-api-versions storage.k8s.io/v1/CSINode --helm-api-versions storage.k8s.io/v1/CSIStorageCapacity --helm-api-versions storage.k8s.io/v1/StorageClass --helm-api-versions storage.k8s.io/v1/VolumeAttachment --helm-api-versions template.openshift.io/v1 --helm-api-versions template.openshift.io/v1/BrokerTemplateInstance --helm-api-versions template.openshift.io/v1/Template --helm-api-versions template.openshift.io/v1/TemplateInstance --helm-api-versions tempo.grafana.com/v1alpha1 --helm-api-versions tempo.grafana.com/v1alpha1/TempoMonolithic --helm-api-versions tempo.grafana.com/v1alpha1/TempoStack --helm-api-versions tuned.openshift.io/v1 --helm-api-versions tuned.openshift.io/v1/Profile --helm-api-versions tuned.openshift.io/v1/Tuned --helm-api-versions user.openshift.io/v1 --helm-api-versions user.openshift.io/v1/Group --helm-api-versions user.openshift.io/v1/Identity --helm-api-versions user.openshift.io/v1/User --helm-api-versions v1 --helm-api-versions v1/ConfigMap --helm-api-versions v1/Endpoints --helm-api-versions v1/Event --helm-api-versions v1/LimitRange --helm-api-versions v1/Namespace --helm-api-versions v1/Node --helm-api-versions v1/PersistentVolume --helm-api-versions v1/PersistentVolumeClaim --helm-api-versions v1/Pod --helm-api-versions v1/PodTemplate --helm-api-versions v1/ReplicationController --helm-api-versions v1/ResourceQuota --helm-api-versions v1/Secret --helm-api-versions v1/Service 
--helm-api-versions v1/ServiceAccount --helm-api-versions whereabouts.cni.cncf.io/v1alpha1 --helm-api-versions whereabouts.cni.cncf.io/v1alpha1/IPPool --helm-api-versions whereabouts.cni.cncf.io/v1alpha1/NodeSlicePool --helm-api-versions whereabouts.cni.cncf.io/v1alpha1/OverlappingRangeIPReservation --helm-api-versions work.open-cluster-management.io/v1 --helm-api-versions work.open-cluster-management.io/v1/AppliedManifestWork` failed exit status 1: Error: Error: failed to untar: a file or directory with the name <path to cached source>/components/vector-kubearchive-log-collector/development/charts/loki-6.30.1/loki-6.30.1.tgz already exists
: unable to run: 'helm pull --untar --untardir <path to cached source>/components/vector-kubearchive-log-collector/development/charts/loki-6.30.1 --repo https://grafana.github.io/helm-charts loki --version 6.30.1' with env=[HELM_CONFIG_HOME=/tmp/kustomize-helm-2851391009/helm HELM_CACHE_HOME=/tmp/kustomize-helm-2851391009/helm/.cache HELM_DATA_HOME=/tmp/kustomize-helm-2851391009/helm/.data] (is 'helm' installed?): exit status 1","type":"ComparisonError"}]
Switched to branch 'main'
Your branch is up to date with 'upstream/main'.
I1105 19:45:16.116809 16569 common.go:283] got an error: exit status 1 - will retry in 10s
[controller-runtime] log.SetLogger(...) was never called; logs will not be displayed.
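Editor's note: the root cause above is a stale cached chart tarball (loki-6.30.1.tgz) left in the kustomize `--enable-helm` charts directory, which makes the subsequent `helm pull --untar` refuse to overwrite it. A minimal cleanup sketch follows; the real cache location is redacted in the log as `<path to cached source>`, so `CHART_DIR` below is a hypothetical stand-in pointing at a local checkout of infra-deployments.

```shell
# Hypothetical stand-in for the redacted "<path to cached source>" prefix:
# run from the root of an infra-deployments checkout.
CHART_DIR="components/vector-kubearchive-log-collector/development/charts/loki-6.30.1"

# Remove the stale loki-6.30.1.tgz (and any partially untarred copy) so the
# next pull can succeed.
rm -rf "$CHART_DIR"

# Re-run the same pull that `kustomize build --enable-helm` attempts,
# when helm is available locally (tolerate offline failure).
if command -v helm >/dev/null 2>&1; then
  helm pull --untar --untardir "$CHART_DIR" \
    --repo https://grafana.github.io/helm-charts loki --version 6.30.1 \
    || echo "helm pull failed (offline or registry unavailable?)"
fi
```

The `helm pull` invocation, chart repo, and version are copied verbatim from the error message above; only the directory prefix is assumed.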
Detected at:
> goroutine 129 [running]:
> runtime/debug.Stack()
> 	/usr/lib/golang/src/runtime/debug/stack.go:26 +0x5e
> sigs.k8s.io/controller-runtime/pkg/log.eventuallyFulfillRoot()
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/log.go:60 +0xcd
> sigs.k8s.io/controller-runtime/pkg/log.(*delegatingLogSink).WithName(0xc0005107c0, {0x2f94a36, 0x14})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/log/deleg.go:147 +0x3e
> github.com/go-logr/logr.Logger.WithName({{0x36ee510, 0xc0005107c0}, 0x0}, {0x2f94a36?, 0x0?})
> 	/opt/app-root/src/go/pkg/mod/github.com/go-logr/logr@v1.4.2/logr.go:345 +0x36
> sigs.k8s.io/controller-runtime/pkg/client.newClient(0x2d72320?, {0x0, 0xc000522a10, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:129 +0xf1
> sigs.k8s.io/controller-runtime/pkg/client.New(0xc000e18d88?, {0x0, 0xc000522a10, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
> 	/opt/app-root/src/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.6/pkg/client/client.go:110 +0x7d
> github.com/konflux-ci/e2e-tests/pkg/clients/kubernetes.NewAdminKubernetesClient()
> 	/tmp/tmp.VY0UsCOJ69/pkg/clients/kubernetes/client.go:157 +0xa5
> github.com/konflux-ci/e2e-tests/magefiles/installation.NewAppStudioInstallController()
> 	/tmp/tmp.VY0UsCOJ69/magefiles/installation/install.go:98 +0x31
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.InstallKonflux()
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/repos/common.go:267 +0x13
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.retry(0x329d968, 0x2, 0x2540be400)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/repos/common.go:286 +0xf9
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine/repos.init.func7(0x36b7d00?)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/repos/common.go:360 +0xae
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.ActionFunc.Execute(0xc?, 0x2f6f495?)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:279 +0x19
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Apply(...)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:315
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x52461c0, 0xc000a59b08)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:348 +0xb3
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x523eca0?, 0xc001583c00?, 0x1f1bc99?}, 0xc000a59b08)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:245 +0x4f
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:308
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Check(0x52462e0, 0xc000a59b08)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:340 +0x2b
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.All.Check({0x5247f60?, 0x7fcee38d55c0?, 0x70?}, 0xc000a59b08)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:245 +0x4f
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*Rule).Eval(...)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:308
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).runLoadedCatalog(0x527d990, {0xc00147e508?, 0xc001155e60?, 0x47?}, 0xc000a59b08)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:129 +0x119
> github.com/konflux-ci/e2e-tests/magefiles/rulesengine.(*RuleEngine).RunRulesOfCategory(0x527d990, {0x2f69843, 0x2}, 0xc000a59b08)
> 	/tmp/tmp.VY0UsCOJ69/magefiles/rulesengine/types.go:121 +0x1b4
> main.CI.TestE2E({})
> 	/tmp/tmp.VY0UsCOJ69/magefiles/magefile.go:330 +0x18a
> main.main.func19({0xc00074d0a0?, 0x0?})
> 	/tmp/tmp.VY0UsCOJ69/magefiles/mage_output_file.go:827 +0xf
> main.main.func12.1()
> 	/tmp/tmp.VY0UsCOJ69/magefiles/mage_output_file.go:302 +0x5b
> created by main.main.func12 in goroutine 1
> 	/tmp/tmp.VY0UsCOJ69/magefiles/mage_output_file.go:297 +0xbe
W1105 19:45:26.118492 16569 install.go:178] folder /tmp/tmp.VY0UsCOJ69/tmp/infra-deployments already exists... removing
I1105 19:45:26.225337 16569 install.go:188] cloning 'https://github.com/redhat-appstudio/infra-deployments' with git ref 'refs/heads/main'
Enumerating objects: 67469, done.
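Editor's note: the retry loop above re-clones infra-deployments and reinstalls, but the underlying ComparisonError can also be inspected directly on the failing Application. A hypothetical manual check, assuming kubectl access to the test cluster and the default openshift-gitops namespace seen elsewhere in this log:

```shell
# Hypothetical manual inspection of the Application that failed to sync.
# Names are taken from the log; cluster access is assumed.
APP="vector-kubearchive-log-collector-in-cluster-local"
NS="openshift-gitops"

if command -v kubectl >/dev/null 2>&1; then
  # Print sync/health status, then each condition type and message
  # (the ComparisonError text appears in .status.conditions).
  kubectl -n "$NS" get application.argoproj.io "$APP" \
    -o jsonpath='{.status.sync.status} {.status.health.status}{"\n"}{range .status.conditions[*]}{.type}: {.message}{"\n"}{end}' \
    || echo "kubectl query failed (no cluster access?)"
fi
```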
Counting objects: 100% (153/153), done.
Compressing objects: 100% (82/82), done.
Total 67469 (delta 111), reused 83 (delta 70), pack-reused 67316 (from 5)
From https://github.com/redhat-appstudio/infra-deployments
 * branch main -> FETCH_HEAD
Already up to date.
Installing the OpenShift GitOps operator subscription:
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller unchanged
clusterrole.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server unchanged
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-application-controller unchanged
clusterrolebinding.rbac.authorization.k8s.io/appstudio-openshift-gitops-argocd-server unchanged
subscription.operators.coreos.com/openshift-gitops-operator unchanged
Waiting for default project (and namespace) to exist: OK
Waiting for OpenShift GitOps Route: OK
argocd.argoproj.io/openshift-gitops patched (no change)
argocd.argoproj.io/openshift-gitops patched (no change)
Switch the Route to use re-encryption
argocd.argoproj.io/openshift-gitops patched (no change)
Restarting ArgoCD Server
pod "openshift-gitops-server-5d6d7ccb9f-2jfwn" deleted
Allow any authenticated users to be admin on the Argo CD instance
argocd.argoproj.io/openshift-gitops patched
Mark Pending PVC as Healthy, workaround for WaitForFirstConsumer StorageClasses.
Warning: unknown field "spec.resourceCustomizations"
argocd.argoproj.io/openshift-gitops patched (no change)
Setting kustomize build options
argocd.argoproj.io/openshift-gitops patched (no change)
Setting ignore Aggregated Roles
argocd.argoproj.io/openshift-gitops patched (no change)
Setting ArgoCD tracking method to annotation
argocd.argoproj.io/openshift-gitops patched (no change)
Restarting GitOps server
deployment.apps/openshift-gitops-server restarted
=========================================================================
Argo CD URL is: https://openshift-gitops-server-openshift-gitops.apps.rosa.kx-6f23184ad7.4331.p3.openshiftapps.com
(NOTE: It may take a few moments for the route to become available)
Waiting for the route: ..........OK
Login/password uses your OpenShift credentials ('Login with OpenShift' button)
Setting secrets for Quality Dashboard
namespace/quality-dashboard configured
Creating secret for CI Helper App
namespace/ci-helper-app configured
Setting secrets for pipeline-service
tekton-results namespace already exists, skipping creation
tekton-logging namespace already exists, skipping creation
product-kubearchive-logging namespace already exists, skipping creation
Creating DB secret
DB secret already exists, skipping creation
Creating S3 secret
S3 secret already exists, skipping creation
Creating S3 secret
S3 secret already exists, skipping creation
Creating Postgres TLS certs
Postgres DB cert secret already exists, skipping creation
namespace/application-service configured
Creating a has secret from legacy token
secret/has-github-token configured
Creating a secret with a token for Image Controller
namespace/image-controller configured
secret/quaytoken configured
Configuring the cluster with a pull secret for Docker Hub
Saved credentials for docker.io into /tmp/tmp.cJzzsnypMH
secret/pull-secret data updated
Saved credentials for docker.io into /tmp/tmp.cJzzsnypMH
secret/docker-io-pull configured
Setting secrets for Dora metrics exporter
namespace/dora-metrics configured
Setting Cluster Mode: preview
Switched to a new branch 'preview-main-drkm'
labeling node/ip-10-0-137-225.ec2.internal...
node/ip-10-0-137-225.ec2.internal not labeled
successfully labeled node/ip-10-0-137-225.ec2.internal
labeling node/ip-10-0-153-179.ec2.internal...
node/ip-10-0-153-179.ec2.internal not labeled
successfully labeled node/ip-10-0-153-179.ec2.internal
labeling node/ip-10-0-165-237.ec2.internal...
node/ip-10-0-165-237.ec2.internal not labeled
successfully labeled node/ip-10-0-165-237.ec2.internal
verifying labels...
all nodes labeled successfully.
Detected OCP minor version: 17
Changing AppStudio Gitlab Org to "redhat-appstudio-qe"
[preview-main-drkm f995c3462] Preview mode, do not merge into main
 6 files changed, 12 insertions(+), 18 deletions(-)
remote:
remote: Create a pull request for 'preview-main-drkm' on GitHub by visiting:
remote: https://github.com/redhat-appstudio-qe/infra-deployments/pull/new/preview-main-drkm
remote:
To https://github.com/redhat-appstudio-qe/infra-deployments.git
 * [new branch] preview-main-drkm -> preview-main-drkm
branch 'preview-main-drkm' set up to track 'qe/preview-main-drkm'.
application.argoproj.io/all-application-sets configured
application.argoproj.io/cert-manager-in-cluster-local patched
application.argoproj.io/build-service-in-cluster-local patched
application.argoproj.io/konflux-rbac-in-cluster-local patched
application.argoproj.io/tracing-workload-tracing-in-cluster-local patched
application.argoproj.io/pipeline-service-in-cluster-local patched
application.argoproj.io/enterprise-contract-in-cluster-local patched
application.argoproj.io/build-templates-in-cluster-local patched
application.argoproj.io/image-controller-in-cluster-local patched
application.argoproj.io/perf-team-prometheus-reader-in-cluster-local patched
application.argoproj.io/postgres patched
application.argoproj.io/tracing-workload-otel-collector-in-cluster-local patched
application.argoproj.io/all-application-sets patched
application.argoproj.io/application-api-in-cluster-local patched
application.argoproj.io/multi-platform-controller-in-cluster-local patched
application.argoproj.io/project-controller-in-cluster-local patched
application.argoproj.io/kubearchive-in-cluster-local patched
application.argoproj.io/mintmaker-in-cluster-local patched
application.argoproj.io/kyverno-in-cluster-local patched
application.argoproj.io/crossplane-control-plane-in-cluster-local patched
application.argoproj.io/squid-in-cluster-local patched
application.argoproj.io/has-in-cluster-local patched
application.argoproj.io/dora-metrics-in-cluster-local patched
application.argoproj.io/knative-eventing-in-cluster-local patched
application.argoproj.io/monitoring-workload-grafana-in-cluster-local patched
application.argoproj.io/monitoring-workload-prometheus-in-cluster-local patched
application.argoproj.io/trust-manager-in-cluster-local patched
application.argoproj.io/konflux-kite-in-cluster-local patched
application.argoproj.io/integration-in-cluster-local patched
application.argoproj.io/vector-tekton-logs-collector-in-cluster-local patched
application.argoproj.io/repository-validator-in-cluster-local patched
application.argoproj.io/disable-csvcopy-in-cluster-local patched
application.argoproj.io/vector-kubearchive-log-collector-in-cluster-local patched (no change)
application.argoproj.io/release-in-cluster-local patched
application.argoproj.io/tempo-in-cluster-local patched
application.argoproj.io/policies-in-cluster-local patched
application.argoproj.io/kueue-in-cluster-local patched
application.argoproj.io/internal-services-in-cluster-local patched
All Applications are synced and Healthy
All required tekton resources are installed and ready
Tekton CRDs are ready
Setup Pac with existing QE sprayproxy and github App
namespace/openshift-pipelines configured
namespace/build-service configured
namespace/integration-service configured
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
secret/pipelines-as-code-secret created
Configured pipelines-as-code-secret secret in openshift-pipelines namespace
Switched to branch 'main'
Your branch is up to date with 'upstream/main'.
I1105 19:49:22.990748 16569 common.go:434] Registered PaC server: https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-6f23184ad7.4331.p3.openshiftapps.com
I1105 19:49:23.060489 16569 common.go:459] The PaC servers registered in Sprayproxy: https://pipelines-as-code-controller-openshift-pipelines.apps.konflux-4-17-us-west-2-k9vn8.konflux-qe.devcluster.openshift.com, https://pipelines-as-code-controller-openshift-pipelines.apps.rosa.kx-6f23184ad7.4331.p3.openshiftapps.com, https://pipelines-as-code-controller-openshift-pipelines.apps.konflux-4-18-us-west-2-5hxlg.konflux-qe.devcluster.openshift.com
I1105 19:49:23.060516 16569 common.go:475] going to create new Tekton bundle remote-build for the purpose of testing multi-platform-controller PR
I1105 19:49:23.626141 16569 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:563ca1782a57a802491e3fc1eba3c7ad7dbe29297e376ad738dc22b6bc39a672
I1105 19:49:23.628064 16569 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372163-uikx -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I1105 19:49:24.719465 16569 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372163-uikx: quay.io/redhat-appstudio-qe/test-images@sha256:bd17fffb90b3b91b5dee7f830a5b147be4c41ef96e98ecf5bb616ae87ed0bfd3
I1105 19:49:24.719495 16569 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_ARM64 to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372163-uikx
I1105 19:49:24.944353 16569 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:563ca1782a57a802491e3fc1eba3c7ad7dbe29297e376ad738dc22b6bc39a672
I1105 19:49:24.946323 16569 util.go:512] found credentials for image ref
quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372164-bzgx -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I1105 19:49:26.237208 16569 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372164-bzgx: quay.io/redhat-appstudio-qe/test-images@sha256:bd5b81da148e32ab4cb3508abd41982a9515a0634fa19a86275e90ab79e8afa0
I1105 19:49:26.237239 16569 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_S390X to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372164-bzgx
I1105 19:49:26.698543 16569 common.go:516] Found current task ref quay.io/konflux-ci/tekton-catalog/task-buildah:0.6@sha256:563ca1782a57a802491e3fc1eba3c7ad7dbe29297e376ad738dc22b6bc39a672
I1105 19:49:26.701821 16569 util.go:512] found credentials for image ref quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372166-wjwu -> user: redhat-appstudio-qe+redhat_appstudio_quality
Creating Tekton Bundle:
	- Added Pipeline: buildah-remote-pipeline to image
I1105 19:49:27.883293 16569 bundle.go:57] image digest for a new tekton bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372166-wjwu: quay.io/redhat-appstudio-qe/test-images@sha256:90426fff925894f64f62a38ce39b463f481b16f2d078903114c1b5548339ccd5
I1105 19:49:27.883327 16569 common.go:542] SETTING ENV VAR CUSTOM_BUILDAH_REMOTE_PIPELINE_BUILD_BUNDLE_PPC64LE to value quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372166-wjwu
exec: ginkgo "--seed=1762371621" "--timeout=1h30m0s" "--grace-period=30s" "--output-interceptor-mode=none" "--label-filter=!upgrade-create && !upgrade-verify && !upgrade-cleanup && !release-pipelines" "--no-color" "--json-report=e2e-report.json" "--junit-report=e2e-report.xml" "--procs=20" "--nodes=20" "--p" "--output-dir=/workspace/artifact-dir" "./cmd" "--"
go: downloading github.com/konflux-ci/build-service
v0.0.0-20240611083846-2dee6cfe6fe4
go: downloading github.com/IBM/go-sdk-core/v5 v5.15.3
go: downloading github.com/IBM/vpc-go-sdk v0.48.0
go: downloading github.com/aws/aws-sdk-go-v2 v1.32.7
go: downloading github.com/aws/aws-sdk-go-v2/service/ec2 v1.135.0
go: downloading github.com/aws/aws-sdk-go-v2/config v1.28.7
go: downloading github.com/aws/smithy-go v1.22.1
go: downloading github.com/aws/aws-sdk-go-v2/credentials v1.17.48
go: downloading github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.16.22
go: downloading github.com/aws/aws-sdk-go-v2/internal/ini v1.8.1
go: downloading github.com/aws/aws-sdk-go-v2/service/sso v1.24.8
go: downloading github.com/aws/aws-sdk-go-v2/service/ssooidc v1.28.7
go: downloading github.com/aws/aws-sdk-go-v2/service/sts v1.33.3
go: downloading github.com/go-openapi/strfmt v0.22.0
go: downloading github.com/go-playground/validator/v10 v10.17.0
go: downloading github.com/aws/aws-sdk-go-v2/internal/configsources v1.3.26
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/mitchellh/mapstructure v1.5.0
go: downloading github.com/go-openapi/errors v0.21.0
go: downloading github.com/oklog/ulid v1.3.1
go: downloading go.mongodb.org/mongo-driver v1.13.1
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.12.1
go: downloading github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.12.7
go: downloading github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.26
go: downloading github.com/google/go-github/v45 v45.2.0
go: downloading github.com/leodido/go-urn v1.3.0
go: downloading github.com/gabriel-vasile/mimetype v1.4.3
go: downloading github.com/go-playground/universal-translator v0.18.1
go: downloading github.com/go-playground/locales v0.14.1
Running Suite: Red Hat App Studio E2E tests - /tmp/tmp.VY0UsCOJ69/cmd
=====================================================================
Random Seed: 1762371621

Will run 312 of 390 specs
Running in parallel
across 20 processes
S
------------------------------
P [PENDING]
[release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if release CR is created [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.VY0UsCOJ69/tests/release/pipelines/release_to_github.go:139
------------------------------
SSS
------------------------------
P [PENDING]
[release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies the release pipelinerun is running and succeeds [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.VY0UsCOJ69/tests/release/pipelines/release_to_github.go:149
------------------------------
SSSS
------------------------------
P [PENDING]
[release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies release CR completed and set succeeded. [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.VY0UsCOJ69/tests/release/pipelines/release_to_github.go:182
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-host-pool]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:120
------------------------------
SS
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-host-pool]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:124
------------------------------
S
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with
multi-platform-build is created The multi platform secret is populated [multi-platform, aws-host-pool]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:127
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, aws-host-pool]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:148
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws host-pool allocation when the Component with multi-platform-build is created test that cleanup happened successfully [multi-platform, aws-host-pool]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:152
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, aws-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:251
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, aws-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:255
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, aws-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:259
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes
successfully [multi-platform, aws-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:263
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] aws dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, aws-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:267
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmz-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:341
------------------------------
P [PENDING]
[release-pipelines-suite e2e tests for release-to-github pipeline] Release-to-github happy path Post-release verification verifies if the Release exists in github repo [release-pipelines, release-to-github, releaseToGithub]
/tmp/tmp.VY0UsCOJ69/tests/release/pipelines/release_to_github.go:193
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmz-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:345
------------------------------
S
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmz-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:349
------------------------------
SS
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with
multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmz-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:353
------------------------------
S
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm system z dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmz-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:357
------------------------------
SS
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created a PipelineRun is triggered [multi-platform, ibmp-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:432
------------------------------
S
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created the build-container task from component pipelinerun is buildah-remote [multi-platform, ibmp-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:436
------------------------------
S
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created The multi platform secret is populated [multi-platform, ibmp-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:440
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created that PipelineRun completes successfully [multi-platform, ibmp-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:444
------------------------------
P [PENDING]
[multi-platform-build-service-suite Multi Platform Controller E2E tests] ibm power pc dynamic allocation when the Component with multi-platform-build is created check cleanup happened successfully [multi-platform, ibmp-dynamic]
/tmp/tmp.VY0UsCOJ69/tests/build/multi-platform.go:448
------------------------------
P [PENDING]
[task-suite tkn bundle task] creates Tekton bundles with different params when context points to a file [build-templates]
/tmp/tmp.VY0UsCOJ69/tests/build/tkn-bundle.go:177
------------------------------
P [PENDING]
[task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles from specific context [build-templates]
/tmp/tmp.VY0UsCOJ69/tests/build/tkn-bundle.go:188
------------------------------
SS
------------------------------
P [PENDING]
[task-suite tkn bundle task] creates Tekton bundles with different params when context is the root directory [build-templates]
/tmp/tmp.VY0UsCOJ69/tests/build/tkn-bundle.go:198
------------------------------
P [PENDING]
[task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when context points to a file and a directory [build-templates]
/tmp/tmp.VY0UsCOJ69/tests/build/tkn-bundle.go:207
------------------------------
S
------------------------------
P [PENDING]
[task-suite tkn bundle task] creates Tekton bundles with different params creates Tekton bundles when using negation [build-templates]
/tmp/tmp.VY0UsCOJ69/tests/build/tkn-bundle.go:217
------------------------------
S
------------------------------
P [PENDING]
[task-suite tkn bundle task] creates Tekton bundles with different params allows overriding HOME environment variable [build-templates]
/tmp/tmp.VY0UsCOJ69/tests/build/tkn-bundle.go:227
------------------------------
S
------------------------------
P [PENDING]
[task-suite tkn bundle task] creates Tekton bundles with different params allows overriding STEP image [build-templates]
/tmp/tmp.VY0UsCOJ69/tests/build/tkn-bundle.go:236
------------------------------
SSSSSSSSSSSSSSSSSSSS•••••••••
------------------------------
• [FAILED] [65.632 seconds]
[build-service-suite Build templates E2E test] HACBS pipelines [BeforeAll] triggers PipelineRun for symlink component with source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic with component name test-symlink-comp-kclg [build, build-templates, HACBS, pipeline-service, pipeline, build-templates-e2e, source-build-e2e]
[BeforeAll] /tmp/tmp.VY0UsCOJ69/tests/build/build_templates.go:235
[It] /tmp/tmp.VY0UsCOJ69/tests/build/build_templates.go:321

Timeline >>
[FAILED] in [BeforeAll] - /tmp/tmp.VY0UsCOJ69/tests/build/build_templates.go:261 @ 11/05/25 19:51:05.07
error while getting pipelineruns: no pipelinerun found for application test-app-ycbo
error while getting pipelineruns: no pipelinerun found for application test-app-ycbo
error while getting pipelineruns: no pipelinerun found for application test-app-ycbo
error while getting pipelineruns: no pipelinerun found for application test-app-ycbo
error while getting pipelineruns: no pipelinerun found for application test-app-ycbo
error while getting pipelineruns: no pipelinerun found for application test-app-ycbo
[FAILED] in [AfterAll] - /tmp/tmp.VY0UsCOJ69/tests/build/build_templates.go:289 @ 11/05/25 19:52:05.07
<< Timeline

[FAILED] failed to create component for scenario: sample-python-basic-oci
Unexpected error:
    <*errors.errorString | 0xc001530720>:
    failed to update BUILDAH_FORMAT in the pipeline bundle with: error when building/pushing a tekton pipeline bundle: error when pushing a bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372264-kxkw to a container image registry repo: could not push image to registry as "quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372264-kxkw": POST https://quay.io/v2/redhat-appstudio-qe/test-images/blobs/uploads/: UNAUTHORIZED: access to the requested resource is not
authorized; map[]
    {
        s: "failed to update BUILDAH_FORMAT in the pipeline bundle with: error when building/pushing a tekton pipeline bundle: error when pushing a bundle quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372264-kxkw to a container image registry repo: could not push image to registry as \"quay.io/redhat-appstudio-qe/test-images:pipeline-bundle-1762372264-kxkw\": POST https://quay.io/v2/redhat-appstudio-qe/test-images/blobs/uploads/: UNAUTHORIZED: access to the requested resource is not authorized; map[]\n",
    }
occurred
In [BeforeAll] at: /tmp/tmp.VY0UsCOJ69/tests/build/build_templates.go:261 @ 11/05/25 19:51:05.07

There were additional failures detected. To view them in detail run ginkgo -vv
------------------------------
SSSSSSSS
------------------------------
P [PENDING]
[build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build-oci-ta should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline]
/tmp/tmp.VY0UsCOJ69/tests/build/build_templates.go:491
------------------------------
SSSSSSSSSS
------------------------------
P [PENDING]
[build-service-suite Build templates E2E test] HACBS pipelines scenario sample-python-basic-oci when Pipeline Results are stored for component with Git source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic and Pipeline docker-build should have Pipeline Logs [build, build-templates, HACBS, pipeline-service, pipeline]
/tmp/tmp.VY0UsCOJ69/tests/build/build_templates.go:491
------------------------------
SSSSS••••••••••••••••••••••••••••••••••••••••••••••••••••••••
------------------------------
P [PENDING]
[build-service-suite Build service E2E tests] test build secret lookup when two secrets are created when second component is deleted, pac pr branch should not exist in the
repo [build-service, pac-build, secret-lookup]
/tmp/tmp.VY0UsCOJ69/tests/build/build.go:1121
------------------------------
••••••••••••••••••••••••••••••••••••••
------------------------------
• [FAILED] [1.738 seconds]
[integration-service-suite Status Reporting of Integration tests] with status reporting of Integration tests in CheckRuns when a new Component with specified custom branch is created [It] eventually leads to the build PipelineRun's status reported at Checks tab [integration-service, github-status-reporting, custom-branch]
/tmp/tmp.VY0UsCOJ69/tests/integration-service/status-reporting-to-pullrequest.go:144

[FAILED] Expected
    : failure
to equal
    : success
In [It] at: /tmp/tmp.VY0UsCOJ69/tests/integration-service/status-reporting-to-pullrequest.go:146 @ 11/05/25 20:00:17.237
------------------------------
SSSSSSSSSSSSSSSS•••
------------------------------
• [FAILED] [1.677 seconds]
[integration-service-suite Creation of group snapshots for monorepo and multiple repos] with status reporting of Integration tests in CheckRuns when we start creation of a new Component A [It] eventually leads to the build PipelineRun'sA status reported at Checks tab [integration-service, group-snapshot-creation]
/tmp/tmp.VY0UsCOJ69/tests/integration-service/group-snapshots-tests.go:178

[FAILED] Expected
    : failure
to equal
    : success
In [It] at: /tmp/tmp.VY0UsCOJ69/tests/integration-service/group-snapshots-tests.go:180 @ 11/05/25 20:00:28.088
------------------------------
SSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS••••••••••••••••••••••••••••
------------------------------
• [FAILED] [1.740 seconds]
[build-service-suite Build service E2E tests] test PaC component build github when the PaC init branch is updated [It] eventually leads to another update of a PR about the PipelineRun status report at Checks tab [build-service, github-webhook, pac-build, pipeline, image-controller, build-custom-branch]
/tmp/tmp.VY0UsCOJ69/tests/build/build.go:514

[FAILED] Expected
    : failure
to equal
    : success
In [It] at: /tmp/tmp.VY0UsCOJ69/tests/build/build.go:518 @ 11/05/25 20:02:37.243
------------------------------
SSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS••••••••••••••••••••••••••••••••••••••••••
------------------------------
• [FAILED] [165.414 seconds]
[build-service-suite Build service E2E tests] test of component update with renovate gitlab when components are created in same namespace [It] should lead to a nudge PR creation for child component [build-service, renovate, multi-component]
/tmp/tmp.VY0UsCOJ69/tests/build/build.go:1607

[FAILED] Unexpected error:
    : stream error: stream ID 143; INTERNAL_ERROR; received from peer
    {
        StreamID: 143,
        Code: 2,
        Cause: <*errors.errorString | 0x540b310>{
            s: "received from peer",
        },
    }
occurred
In [It] at: /tmp/tmp.VY0UsCOJ69/tests/build/build.go:1613 @ 11/05/25 20:14:31.335
------------------------------
SS••••••••••••••

Summarizing 5 Failures:
  [FAIL] [build-service-suite Build templates E2E test] HACBS pipelines [BeforeAll] triggers PipelineRun for symlink component with source URL https://github.com/redhat-appstudio-qe/devfile-sample-python-basic with component name test-symlink-comp-kclg [build, build-templates, HACBS, pipeline-service, pipeline, build-templates-e2e, source-build-e2e]
  /tmp/tmp.VY0UsCOJ69/tests/build/build_templates.go:261
  [FAIL] [integration-service-suite Status Reporting of Integration tests] with status reporting of Integration tests in CheckRuns when a new Component with specified custom branch is created [It] eventually leads to the build PipelineRun's status reported at Checks tab [integration-service, github-status-reporting, custom-branch]
  /tmp/tmp.VY0UsCOJ69/tests/integration-service/status-reporting-to-pullrequest.go:146
  [FAIL] [integration-service-suite Creation of group snapshots for monorepo and multiple repos] with status reporting of Integration tests in CheckRuns when we start creation of a new Component A [It] eventually leads to the build PipelineRun'sA status reported at Checks tab
  [integration-service, group-snapshot-creation]
  /tmp/tmp.VY0UsCOJ69/tests/integration-service/group-snapshots-tests.go:180
  [FAIL] [build-service-suite Build service E2E tests] test of component update with renovate gitlab when components are created in same namespace [It] should lead to a nudge PR creation for child component [build-service, renovate, multi-component]
  /tmp/tmp.VY0UsCOJ69/tests/build/build.go:1613
  [FAIL] [build-service-suite Build service E2E tests] test PaC component build github when the PaC init branch is updated [It] eventually leads to another update of a PR about the PipelineRun status report at Checks tab [build-service, github-webhook, pac-build, pipeline, image-controller, build-custom-branch]
  /tmp/tmp.VY0UsCOJ69/tests/build/build.go:518

Ran 195 of 390 Specs in 1973.386 seconds
FAIL! -- 190 Passed | 5 Failed | 34 Pending | 161 Skipped

Ginkgo ran 1 suite in 34m25.952443781s

Test Suite Failed
Error: running "ginkgo --seed=1762371621 --timeout=1h30m0s --grace-period=30s --output-interceptor-mode=none --label-filter=!upgrade-create && !upgrade-verify && !upgrade-cleanup && !release-pipelines --no-color --json-report=e2e-report.json --junit-report=e2e-report.xml --procs=20 --nodes=20 --p --output-dir=/workspace/artifact-dir ./cmd --" failed with exit code 1
make: *** [Makefile:25: ci/test/e2e] Error 1
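Since Ginkgo randomizes spec ordering, the 5 failures above can be retried deterministically by reusing the logged seed (Random Seed: 1762371621) and narrowing the run with `--focus`. A minimal sketch; the focus pattern below is an illustrative choice, not part of the original invocation:

```shell
#!/bin/sh
# SEED is taken from the "Random Seed" line in the log above.
# FOCUS is a hypothetical example; substitute text from any failed spec.
SEED=1762371621
FOCUS="test PaC component build github"

# Print the rerun command rather than executing it, so it can be reviewed
# (and environment variables like CUSTOM_*_BUNDLE exported) first.
echo ginkgo --seed="${SEED}" --timeout=1h30m0s --focus="${FOCUS}" -v ./cmd --
```

Running with `-v` (or `ginkgo -vv`, as the log itself suggests) surfaces the additional failure detail that the summarized report omits.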