Time Namespace Component RelatedObject Reason Message

kserve-ci-e2e-test

success-200-isvc-12404-predictor-564cf4d979-76fsj

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-12404-predictor-564cf4d979-76fsj to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

error-404-isvc-3a238-predictor-6d56d496bd-kgdhd

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-3a238-predictor-6d56d496bd-kgdhd to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8 to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

success-200-isvc-c1a35-predictor-869c6689b7-jfwbk

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-c1a35-predictor-869c6689b7-jfwbk to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-64806-predictor-5cd95b49d6-g7v8z to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-66d58bff49-kwvch to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-87d6c5875-x4cml to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

success-200-isvc-98aac-predictor-7df4fb9989-xwljl

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-98aac-predictor-7df4fb9989-xwljl to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-75b0d-predictor-9549449fc-92fhr to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

success-200-isvc-64806-predictor-6bdc69944f-rmdz9

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-64806-predictor-6bdc69944f-rmdz9 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

ensemble-graph-f3231-7b5968d9d6-rbjc2

Scheduled

Successfully assigned kserve-ci-e2e-test/ensemble-graph-f3231-7b5968d9d6-rbjc2 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

splitter-graph-12404-554567d986-qpd8n

Scheduled

Successfully assigned kserve-ci-e2e-test/splitter-graph-12404-554567d986-qpd8n to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

error-404-isvc-f3231-predictor-69f59b6d96-7wdnc

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-f3231-predictor-69f59b6d96-7wdnc to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-75b0d-predictor-758f7456fd-b6kvp to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

success-200-isvc-3a238-predictor-644f78f4dc-brf2z

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-3a238-predictor-644f78f4dc-brf2z to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-b868d-predictor-5b5b76b898-87zsk to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

model-chainer-bfd66c7dd-dqj6x

Scheduled

Successfully assigned kserve-ci-e2e-test/model-chainer-bfd66c7dd-dqj6x to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

switch-graph-d41b1-655d9c9d4c-sgp2v

Scheduled

Successfully assigned kserve-ci-e2e-test/switch-graph-d41b1-655d9c9d4c-sgp2v to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-d41b1-predictor-597b847c78-mknpz to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-vw4f4 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

error-404-isvc-12404-predictor-6cb5f67968-778ps

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-12404-predictor-6cb5f67968-778ps to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

sequence-graph-c1a35-5b75598d54-4l255

Scheduled

Successfully assigned kserve-ci-e2e-test/sequence-graph-c1a35-5b75598d54-4l255 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

sequence-graph-64806-67786fdb9f-r75d4

Scheduled

Successfully assigned kserve-ci-e2e-test/sequence-graph-64806-67786fdb9f-r75d4 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

switch-graph-b868d-787648c54b-r6sn7

Scheduled

Successfully assigned kserve-ci-e2e-test/switch-graph-b868d-787648c54b-r6sn7 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

switch-graph-98aac-579b965bb4-xcntc

Scheduled

Successfully assigned kserve-ci-e2e-test/switch-graph-98aac-579b965bb4-xcntc to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

success-200-isvc-f3231-predictor-788854545f-lsqvw

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-f3231-predictor-788854545f-lsqvw to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

error-404-isvc-98aac-predictor-5678f7cc74-45scf

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-98aac-predictor-5678f7cc74-45scf to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

ensemble-graph-75b0d-699b59b6f-snfrm

Scheduled

Successfully assigned kserve-ci-e2e-test/ensemble-graph-75b0d-699b59b6f-snfrm to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

error-404-isvc-c1a35-predictor-5b5885dd-sx2hh

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-c1a35-predictor-5b5885dd-sx2hh to ip-10-0-132-81.ec2.internal

kserve-ci-e2e-test

sequence-graph-3a238-f58988f4b-8hmrl

Scheduled

Successfully assigned kserve-ci-e2e-test/sequence-graph-3a238-f58988f4b-8hmrl to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

splitter-graph-4efea-5bb974bbff-2dr89

Scheduled

Successfully assigned kserve-ci-e2e-test/splitter-graph-4efea-5bb974bbff-2dr89 to ip-10-0-128-21.ec2.internal

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-d41b1

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-d41b1": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-d41b1": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-d41b1

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-d41b1": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

isvc-sklearn-graph-1-predictor

ScalingReplicaSet

Scaled up replica set isvc-sklearn-graph-1-predictor-66d58bff49 from 0 to 1

kserve-ci-e2e-test

deployment-controller

success-200-isvc-d41b1-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-d41b1-predictor-6fd5c66657 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-d41b1-predictor-6fd5c66657

SuccessfulCreate

Created pod: success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

kserve-ci-e2e-test

deployment-controller

error-404-isvc-d41b1-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-d41b1-predictor-597b847c78 from 0 to 1

kserve-ci-e2e-test

deployment-controller

isvc-sklearn-graph-2-predictor

ScalingReplicaSet

Scaled up replica set isvc-sklearn-graph-2-predictor-87d6c5875 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

isvc-xgboost-graph-predictor-669d8d6456

SuccessfulCreate

Created pod: isvc-xgboost-graph-predictor-669d8d6456-vw4f4

kserve-ci-e2e-test

replicaset-controller

isvc-sklearn-graph-1-predictor-66d58bff49

SuccessfulCreate

Created pod: isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Pulling

Pulling image "quay.io/opendatahub/success-200-isvc:odh-pr-1447"

kserve-ci-e2e-test

multus

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

AddedInterface

Add eth0 [10.132.0.27/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:85f85c16414082de64de782f602228652c10c5bace08b518e803a7e911c62e5e"

kserve-ci-e2e-test

multus

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

AddedInterface

Add eth0 [10.132.0.28/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-d41b1-predictor-serving-cert" not found

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "isvc-sklearn-graph-1-predictor-serving-cert" not found

kserve-ci-e2e-test

replicaset-controller

isvc-sklearn-graph-2-predictor-87d6c5875

SuccessfulCreate

Created pod: isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-d41b1-predictor-597b847c78

SuccessfulCreate

Created pod: error-404-isvc-d41b1-predictor-597b847c78-mknpz

kserve-ci-e2e-test

deployment-controller

isvc-xgboost-graph-predictor

ScalingReplicaSet

Scaled up replica set isvc-xgboost-graph-predictor-669d8d6456 from 0 to 1

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:85f85c16414082de64de782f602228652c10c5bace08b518e803a7e911c62e5e"

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Pulling

Pulling image "quay.io/opendatahub/error-404-isvc:odh-pr-1447"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:85f85c16414082de64de782f602228652c10c5bace08b518e803a7e911c62e5e"

kserve-ci-e2e-test

multus

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

AddedInterface

Add eth0 [10.133.0.28/23] from ovn-kubernetes

kserve-ci-e2e-test

multus

error-404-isvc-d41b1-predictor-597b847c78-mknpz

AddedInterface

Add eth0 [10.133.0.27/23] from ovn-kubernetes

kserve-ci-e2e-test

multus

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

AddedInterface

Add eth0 [10.133.0.26/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:85f85c16414082de64de782f602228652c10c5bace08b518e803a7e911c62e5e" in 12.593s (12.593s including waiting). Image size: 299849144 bytes.

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Pulled

Successfully pulled image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" in 12.921s (12.921s including waiting). Image size: 1335723117 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:85f85c16414082de64de782f602228652c10c5bace08b518e803a7e911c62e5e" in 12.108s (12.108s including waiting). Image size: 299849144 bytes.

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:85f85c16414082de64de782f602228652c10c5bace08b518e803a7e911c62e5e" in 13.784s (13.784s including waiting). Image size: 299849144 bytes.

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.487s (2.487s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Pulled

Successfully pulled image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" in 14.238s (14.238s including waiting). Image size: 1334715495 bytes.

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Pulling

Pulling image "kserve/xgbserver:latest"

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.576s (2.576s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Pulling

Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1447"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Pulling

Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1447"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Pulled

Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1447" in 6.415s (6.415s including waiting). Image size: 1560926130 bytes.

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-66d58bff49-kwvch

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-87d6c5875-x4cml

Pulled

Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1447" in 6.393s (6.393s including waiting). Image size: 1560926130 bytes.
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-d41b1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-d41b1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Pulled

Successfully pulled image "kserve/xgbserver:latest" in 19.329s (19.329s including waiting). Image size: 1306417402 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-vw4f4

Started

Started container kube-rbac-proxy
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-d41b1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-d41b1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Unhealthy

Readiness probe failed: dial tcp 10.132.0.27:8080: connect: connection refused
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-2-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-2-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-d41b1

InferenceServiceReady

InferenceService [success-200-isvc-d41b1] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-d41b1

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-d41b1

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-d41b1

InferenceServiceReady

InferenceService [error-404-isvc-d41b1] is Ready

kserve-ci-e2e-test

InferenceGraphController

switch-graph-d41b1

InferenceGraphReady

InferenceGraph [switch-graph-d41b1] is Ready

kserve-ci-e2e-test

kubelet

switch-graph-d41b1-655d9c9d4c-sgp2v

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "switch-graph-d41b1-serving-cert" not found

kserve-ci-e2e-test

deployment-controller

switch-graph-d41b1

ScalingReplicaSet

Scaled up replica set switch-graph-d41b1-655d9c9d4c from 0 to 1

kserve-ci-e2e-test

replicaset-controller

switch-graph-d41b1-655d9c9d4c

SuccessfulCreate

Created pod: switch-graph-d41b1-655d9c9d4c-sgp2v

kserve-ci-e2e-test

multus

switch-graph-d41b1-655d9c9d4c-sgp2v

AddedInterface

Add eth0 [10.132.0.29/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

switch-graph-d41b1-655d9c9d4c-sgp2v

Pulling

Pulling image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887"

kserve-ci-e2e-test

kubelet

switch-graph-d41b1-655d9c9d4c-sgp2v

Created

Created container: switch-graph-d41b1

kserve-ci-e2e-test

kubelet

switch-graph-d41b1-655d9c9d4c-sgp2v

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" in 1.995s (1.995s including waiting). Image size: 216340060 bytes.

kserve-ci-e2e-test

kubelet

switch-graph-d41b1-655d9c9d4c-sgp2v

Started

Started container switch-graph-d41b1
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-d41b1-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-d41b1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-d41b1-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-d41b1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-1

InferenceServiceReady

InferenceService [isvc-sklearn-graph-1] is Ready

kserve-ci-e2e-test

kubelet

switch-graph-d41b1-655d9c9d4c-sgp2v

Killing

Stopping container switch-graph-d41b1

kserve-ci-e2e-test

deployment-controller

success-200-isvc-b868d-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-b868d-predictor-6f6bd7f57 from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Killing

Stopping container kserve-container
(x9)

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-2

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-2

InferenceServiceReady

InferenceService [isvc-sklearn-graph-2] is Ready
(x10)

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-1

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

deployment-controller

error-404-isvc-b868d-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-b868d-predictor-5b5b76b898 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-b868d-predictor-5b5b76b898

SuccessfulCreate

Created pod: error-404-isvc-b868d-predictor-5b5b76b898-87zsk
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Unhealthy

Readiness probe failed: dial tcp 10.133.0.27:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-d41b1-predictor-597b847c78-mknpz

Unhealthy

Readiness probe failed: Get "https://10.133.0.27:8643/healthz": dial tcp 10.133.0.27:8643: connect: connection refused

kserve-ci-e2e-test

multus

success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

AddedInterface

Add eth0 [10.132.0.30/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-b868d-predictor-6f6bd7f57

SuccessfulCreate

Created pod: success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-b868d

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-b868d": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-b868d

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-b868d": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-b868d": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-b868d-predictor-serving-cert" not found

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-b868d

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-b868d": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-b868d": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-b868d

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-b868d": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

multus

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

AddedInterface

Add eth0 [10.133.0.29/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-b868d-predictor-5b5b76b898-87zsk

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-d41b1-predictor-6fd5c66657-8qwq2

Unhealthy

Readiness probe failed: Get "https://10.132.0.27:8643/healthz": context deadline exceeded

kserve-ci-e2e-test

v1beta1Controllers

isvc-xgboost-graph

InferenceServiceReady

InferenceService [isvc-xgboost-graph] is Ready
(x10)

kserve-ci-e2e-test

v1beta1Controllers

isvc-xgboost-graph

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x6)

kserve-ci-e2e-test

kubelet

switch-graph-d41b1-655d9c9d4c-sgp2v

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test | replicaset-controller | model-chainer-bfd66c7dd | SuccessfulCreate | Created pod: model-chainer-bfd66c7dd-dqj6x
kserve-ci-e2e-test | deployment-controller | model-chainer | ScalingReplicaSet | Scaled up replica set model-chainer-bfd66c7dd from 0 to 1
kserve-ci-e2e-test | InferenceGraphController | model-chainer | InferenceGraphReady | InferenceGraph [model-chainer] is Ready
kserve-ci-e2e-test | kubelet | model-chainer-bfd66c7dd-dqj6x | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "model-chainer-serving-cert" not found (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | model-chainer-bfd66c7dd-dqj6x | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine (x7)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x7)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | model-chainer-bfd66c7dd-dqj6x | Started | Started container model-chainer
kserve-ci-e2e-test | kubelet | model-chainer-bfd66c7dd-dqj6x | Created | Created container: model-chainer
kserve-ci-e2e-test | multus | model-chainer-bfd66c7dd-dqj6x | AddedInterface | Add eth0 [10.132.0.31/23] from ovn-kubernetes (x4)

kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-87d6c5875-x4cml | Killing | Stopping container kserve-container
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-3a238-predictor-644f78f4dc | SuccessfulCreate | Created pod: success-200-isvc-3a238-predictor-644f78f4dc-brf2z
kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-vw4f4 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-vw4f4 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | model-chainer-bfd66c7dd-dqj6x | Killing | Stopping container model-chainer
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-87d6c5875-x4cml | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | deployment-controller | success-200-isvc-3a238-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-3a238-predictor-644f78f4dc from 0 to 1
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-3a238-predictor-6d56d496bd | SuccessfulCreate | Created pod: error-404-isvc-3a238-predictor-6d56d496bd-kgdhd
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-66d58bff49-kwvch | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-66d58bff49-kwvch | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-3a238 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-3a238": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-3a238": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-3a238-predictor-serving-cert" not found
kserve-ci-e2e-test | deployment-controller | error-404-isvc-3a238-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-3a238-predictor-6d56d496bd from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-3a238 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-3a238": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-3a238-predictor-serving-cert" not found

kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | AddedInterface | Add eth0 [10.132.0.32/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | multus | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | AddedInterface | Add eth0 [10.133.0.30/23] from ovn-kubernetes

kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-vw4f4 | Unhealthy | Readiness probe failed: Get "https://10.132.0.28:8643/healthz": dial tcp 10.132.0.28:8643: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-b868d-predictor-5b5b76b898-87zsk | Unhealthy | Readiness probe failed: dial tcp 10.133.0.29:8080: connect: connection refused (x9)
kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-vw4f4 | Unhealthy | Readiness probe failed: dial tcp 10.132.0.28:8080: connect: connection refused (x9)
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-66d58bff49-kwvch | Unhealthy | Readiness probe failed: dial tcp 10.133.0.26:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-87d6c5875-x4cml | Unhealthy | Readiness probe failed: Get "https://10.133.0.28:8643/healthz": dial tcp 10.133.0.28:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-66d58bff49-kwvch | Unhealthy | Readiness probe failed: Get "https://10.133.0.26:8643/healthz": dial tcp 10.133.0.26:8643: connect: connection refused (x9)
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-87d6c5875-x4cml | Unhealthy | Readiness probe failed: dial tcp 10.133.0.28:8080: connect: connection refused (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-b868d-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-b868d-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-b868d-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-b868d-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-b868d | InferenceServiceReady | InferenceService [success-200-isvc-b868d] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-b868d | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-b868d | InferenceServiceReady | InferenceService [error-404-isvc-b868d] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-b868d | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-3a238-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-3a238-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test | kubelet | switch-graph-b868d-787648c54b-r6sn7 | Created | Created container: switch-graph-b868d
kserve-ci-e2e-test | InferenceGraphController | switch-graph-b868d | InternalError | fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "switch-graph-b868d": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | switch-graph-b868d-787648c54b-r6sn7 | Started | Started container switch-graph-b868d
kserve-ci-e2e-test | deployment-controller | switch-graph-b868d | ScalingReplicaSet | Scaled up replica set switch-graph-b868d-787648c54b from 0 to 1
kserve-ci-e2e-test | multus | switch-graph-b868d-787648c54b-r6sn7 | AddedInterface | Add eth0 [10.132.0.33/23] from ovn-kubernetes
kserve-ci-e2e-test | replicaset-controller | switch-graph-b868d-787648c54b | SuccessfulCreate | Created pod: switch-graph-b868d-787648c54b-r6sn7
kserve-ci-e2e-test | kubelet | switch-graph-b868d-787648c54b-r6sn7 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine
kserve-ci-e2e-test | InferenceGraphController | switch-graph-b868d | InferenceGraphReady | InferenceGraph [switch-graph-b868d] is Ready
kserve-ci-e2e-test | InferenceGraphController | switch-graph-b868d | UpdateFailed | Failed to update status for InferenceGraph "switch-graph-b868d": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "switch-graph-b868d": the object has been modified; please apply your changes to the latest version and try again (x6)
kserve-ci-e2e-test | kubelet | model-chainer-bfd66c7dd-dqj6x | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-3a238-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-3a238-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Unhealthy | Readiness probe failed: dial tcp 10.133.0.30:8080: connect: connection refused (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-b868d-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-b868d-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x6)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-3a238 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-3a238 | InferenceServiceReady | InferenceService [success-200-isvc-3a238] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-3a238 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-3a238 | InferenceServiceReady | InferenceService [error-404-isvc-3a238] is Ready (x3)

kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-b868d-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-b868d-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | deployment-controller | sequence-graph-3a238 | ScalingReplicaSet | Scaled up replica set sequence-graph-3a238-f58988f4b from 0 to 1
kserve-ci-e2e-test | replicaset-controller | sequence-graph-3a238-f58988f4b | SuccessfulCreate | Created pod: sequence-graph-3a238-f58988f4b-8hmrl
kserve-ci-e2e-test | kubelet | sequence-graph-3a238-f58988f4b-8hmrl | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "sequence-graph-3a238-serving-cert" not found
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-3a238 | InferenceGraphReady | InferenceGraph [sequence-graph-3a238] is Ready (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-3a238-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-3a238-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | multus | sequence-graph-3a238-f58988f4b-8hmrl | AddedInterface | Add eth0 [10.132.0.34/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | sequence-graph-3a238-f58988f4b-8hmrl | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine
kserve-ci-e2e-test | kubelet | sequence-graph-3a238-f58988f4b-8hmrl | Created | Created container: sequence-graph-3a238
kserve-ci-e2e-test | kubelet | sequence-graph-3a238-f58988f4b-8hmrl | Started | Started container sequence-graph-3a238 (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-b868d | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-b868d | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-3a238-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-3a238-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-3a238 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-3a238 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test | kubelet | error-404-isvc-b868d-predictor-5b5b76b898-87zsk | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-b868d-predictor-5b5b76b898-87zsk | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | switch-graph-b868d-787648c54b-r6sn7 | Killing | Stopping container switch-graph-b868d
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f3231 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-f3231": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f3231 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-f3231": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-f3231": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | success-200-isvc-f3231-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-f3231-predictor-788854545f from 0 to 1
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-f3231-predictor-788854545f | SuccessfulCreate | Created pod: success-200-isvc-f3231-predictor-788854545f-lsqvw
kserve-ci-e2e-test | kubelet | success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-f3231-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | success-200-isvc-f3231-predictor-788854545f-lsqvw | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | multus | success-200-isvc-f3231-predictor-788854545f-lsqvw | AddedInterface | Add eth0 [10.132.0.35/23] from ovn-kubernetes
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f3231 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-f3231": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-f3231-predictor-788854545f-lsqvw | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f3231 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-f3231": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-f3231": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-f3231-predictor-69f59b6d96 | SuccessfulCreate | Created pod: error-404-isvc-f3231-predictor-69f59b6d96-7wdnc
kserve-ci-e2e-test | kubelet | success-200-isvc-f3231-predictor-788854545f-lsqvw | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-f3231-predictor-788854545f-lsqvw | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | deployment-controller | error-404-isvc-f3231-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-f3231-predictor-69f59b6d96 from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-f3231-predictor-788854545f-lsqvw | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-f3231-predictor-788854545f-lsqvw | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | multus | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | AddedInterface | Add eth0 [10.133.0.31/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | Created | Created container: kube-rbac-proxy (x7)

kserve-ci-e2e-test | kubelet | success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8 | Unhealthy | Readiness probe failed: dial tcp 10.132.0.30:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | success-200-isvc-b868d-predictor-6f6bd7f57-hn5g8 | Unhealthy | Readiness probe failed: Get "https://10.132.0.30:8643/healthz": dial tcp 10.132.0.30:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-b868d-predictor-5b5b76b898-87zsk | Unhealthy | Readiness probe failed: dial tcp 10.133.0.29:8080: i/o timeout
kserve-ci-e2e-test | kubelet | error-404-isvc-b868d-predictor-5b5b76b898-87zsk | Unhealthy | Readiness probe failed: Get "https://10.133.0.29:8643/healthz": context deadline exceeded (x6)
kserve-ci-e2e-test | kubelet | switch-graph-b868d-787648c54b-r6sn7 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-c1a35-predictor-869c6689b7 | SuccessfulCreate | Created pod: success-200-isvc-c1a35-predictor-869c6689b7-jfwbk
kserve-ci-e2e-test | kubelet | sequence-graph-3a238-f58988f4b-8hmrl | Killing | Stopping container sequence-graph-3a238
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-3a238-predictor-6d56d496bd-kgdhd | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-c1a35 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-c1a35": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-c1a35 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-c1a35": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-c1a35": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | error-404-isvc-c1a35-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-c1a35-predictor-5b5885dd from 0 to 1
kserve-ci-e2e-test | deployment-controller | success-200-isvc-c1a35-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-c1a35-predictor-869c6689b7 from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-c1a35-predictor-5b5885dd-sx2hh | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-c1a35-predictor-serving-cert" not found
kserve-ci-e2e-test | multus | success-200-isvc-c1a35-predictor-869c6689b7-jfwbk | AddedInterface | Add eth0 [10.132.0.36/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-c1a35-predictor-869c6689b7-jfwbk | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-c1a35-predictor-5b5885dd | SuccessfulCreate | Created pod: error-404-isvc-c1a35-predictor-5b5885dd-sx2hh
kserve-ci-e2e-test | kubelet | success-200-isvc-c1a35-predictor-869c6689b7-jfwbk | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-c1a35-predictor-serving-cert" not found

kserve-ci-e2e-test | kubelet | error-404-isvc-c1a35-predictor-5b5885dd-sx2hh | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-c1a35-predictor-869c6689b7-jfwbk | Created | Created container: kserve-container
kserve-ci-e2e-test | multus | error-404-isvc-c1a35-predictor-5b5885dd-sx2hh | AddedInterface | Add eth0 [10.133.0.32/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-c1a35-predictor-869c6689b7-jfwbk | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-c1a35-predictor-5b5885dd-sx2hh | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-c1a35-predictor-869c6689b7-jfwbk | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-c1a35-predictor-5b5885dd-sx2hh | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-c1a35-predictor-5b5885dd-sx2hh | Started | Started container kube-rbac-proxy (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-f3231-predictor-788854545f-lsqvw | Unhealthy | Readiness probe failed: dial tcp 10.132.0.35:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | success-200-isvc-c1a35-predictor-869c6689b7-jfwbk | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-c1a35-predictor-869c6689b7-jfwbk | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-c1a35-predictor-5b5885dd-sx2hh | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-c1a35-predictor-5b5885dd-sx2hh | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Unhealthy | Readiness probe failed: Get "https://10.132.0.32:8643/healthz": dial tcp 10.132.0.32:8643: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | success-200-isvc-3a238-predictor-644f78f4dc-brf2z | Unhealthy | Readiness probe failed: dial tcp 10.132.0.32:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-f3231-predictor-69f59b6d96-7wdnc | Unhealthy | Readiness probe failed: dial tcp 10.133.0.31:8080: connect: connection refused (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-f3231-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-f3231-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-f3231-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-f3231-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x5)

kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f3231 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f3231 | InferenceServiceReady | InferenceService [success-200-isvc-f3231] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f3231 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f3231 | InferenceServiceReady | InferenceService [error-404-isvc-f3231] is Ready
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-c1a35-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-c1a35-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | deployment-controller | ensemble-graph-f3231 | ScalingReplicaSet | Scaled up replica set ensemble-graph-f3231-7b5968d9d6 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-f3231-7b5968d9d6 | SuccessfulCreate | Created pod: ensemble-graph-f3231-7b5968d9d6-rbjc2
kserve-ci-e2e-test | kubelet | ensemble-graph-f3231-7b5968d9d6-rbjc2 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "ensemble-graph-f3231-serving-cert" not found
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-f3231 | InferenceGraphReady | InferenceGraph [ensemble-graph-f3231] is Ready (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-f3231-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-f3231-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-f3231-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-f3231-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | ensemble-graph-f3231-7b5968d9d6-rbjc2 | Created | Created container: ensemble-graph-f3231
kserve-ci-e2e-test | kubelet | ensemble-graph-f3231-7b5968d9d6-rbjc2 | Started | Started container ensemble-graph-f3231
kserve-ci-e2e-test | multus | ensemble-graph-f3231-7b5968d9d6-rbjc2 | AddedInterface | Add eth0 [10.132.0.37/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | ensemble-graph-f3231-7b5968d9d6-rbjc2 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-3a238-f58988f4b-8hmrl | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-c1a35-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-c1a35-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-75b0d-predictor-serving-cert" not found

kserve-ci-e2e-test

deployment-controller

error-404-isvc-75b0d-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-75b0d-predictor-758f7456fd from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-f3231-predictor-788854545f-lsqvw

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-f3231-predictor-788854545f-lsqvw

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

ensemble-graph-f3231-7b5968d9d6-rbjc2

Killing

Stopping container ensemble-graph-f3231

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-75b0d-predictor-758f7456fd

SuccessfulCreate

Created pod: error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-75b0d

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-75b0d": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-75b0d": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-f3231-predictor-69f59b6d96-7wdnc

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Created

Created container: kserve-container

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-75b0d

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-75b0d": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-75b0d

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-75b0d": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-75b0d": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

success-200-isvc-75b0d-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-75b0d-predictor-9549449fc from 0 to 1

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-75b0d-predictor-9549449fc

SuccessfulCreate

Created pod: success-200-isvc-75b0d-predictor-9549449fc-92fhr

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-75b0d

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-75b0d": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine

kserve-ci-e2e-test

multus

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

AddedInterface

Add eth0 [10.133.0.33/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-f3231-predictor-69f59b6d96-7wdnc

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

multus

success-200-isvc-75b0d-predictor-9549449fc-92fhr

AddedInterface

Add eth0 [10.132.0.38/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-f3231-predictor-788854545f-lsqvw

Unhealthy

Readiness probe failed: Get "https://10.132.0.35:8643/healthz": dial tcp 10.132.0.35:8643: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-c1a35-predictor-869c6689b7-jfwbk

Unhealthy

Readiness probe failed: dial tcp 10.132.0.36:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-c1a35-predictor-5b5885dd-sx2hh

Unhealthy

Readiness probe failed: dial tcp 10.133.0.32:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-c1a35

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-c1a35

InferenceServiceReady

InferenceService [success-200-isvc-c1a35] is Ready

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-c1a35

InferenceServiceReady

InferenceService [error-404-isvc-c1a35] is Ready
(x6)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-c1a35

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

deployment-controller

sequence-graph-c1a35

ScalingReplicaSet

Scaled up replica set sequence-graph-c1a35-5b75598d54 from 0 to 1
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-c1a35-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

replicaset-controller

sequence-graph-c1a35-5b75598d54

SuccessfulCreate

Created pod: sequence-graph-c1a35-5b75598d54-4l255

kserve-ci-e2e-test

InferenceGraphController

sequence-graph-c1a35

InferenceGraphReady

InferenceGraph [sequence-graph-c1a35] is Ready
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-c1a35-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

sequence-graph-c1a35-5b75598d54-4l255

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine

kserve-ci-e2e-test

multus

sequence-graph-c1a35-5b75598d54-4l255

AddedInterface

Add eth0 [10.132.0.39/23] from ovn-kubernetes
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-c1a35-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

sequence-graph-c1a35-5b75598d54-4l255

Created

Created container: sequence-graph-c1a35

kserve-ci-e2e-test

kubelet

sequence-graph-c1a35-5b75598d54-4l255

Started

Started container sequence-graph-c1a35
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-c1a35-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x6)

kserve-ci-e2e-test

kubelet

ensemble-graph-f3231-7b5968d9d6-rbjc2

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-75b0d-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-75b0d-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

sequence-graph-c1a35-5b75598d54-4l255

Killing

Stopping container sequence-graph-c1a35

kserve-ci-e2e-test

kubelet

error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-64806-predictor-6bdc69944f-rmdz9

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-64806

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-64806": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-64806

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-64806": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-64806": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

success-200-isvc-64806-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-64806-predictor-6bdc69944f from 0 to 1

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-64806-predictor-6bdc69944f

SuccessfulCreate

Created pod: success-200-isvc-64806-predictor-6bdc69944f-rmdz9

kserve-ci-e2e-test

kubelet

success-200-isvc-64806-predictor-6bdc69944f-rmdz9

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-64806-predictor-6bdc69944f-rmdz9

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-64806-predictor-6bdc69944f-rmdz9

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-64806-predictor-6bdc69944f-rmdz9

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-64806-predictor-6bdc69944f-rmdz9

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-64806

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-64806": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

multus

success-200-isvc-64806-predictor-6bdc69944f-rmdz9

AddedInterface

Add eth0 [10.132.0.40/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-c1a35-predictor-5b5885dd-sx2hh

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-c1a35-predictor-5b5885dd-sx2hh

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-64806

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-64806": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-64806": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine

kserve-ci-e2e-test

multus

error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

AddedInterface

Add eth0 [10.133.0.34/23] from ovn-kubernetes

kserve-ci-e2e-test

deployment-controller

error-404-isvc-64806-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-64806-predictor-5cd95b49d6 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-64806-predictor-5cd95b49d6

SuccessfulCreate

Created pod: error-404-isvc-64806-predictor-5cd95b49d6-g7v8z

kserve-ci-e2e-test

kubelet

success-200-isvc-c1a35-predictor-869c6689b7-jfwbk

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-c1a35-predictor-869c6689b7-jfwbk

Killing

Stopping container kube-rbac-proxy
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Unhealthy

Readiness probe failed: dial tcp 10.133.0.33:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-c1a35-predictor-5b5885dd-sx2hh

Unhealthy

Readiness probe failed: Get "https://10.133.0.32:8643/healthz": dial tcp 10.133.0.32:8643: connect: connection refused
(x6)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-75b0d

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-75b0d

InferenceServiceReady

InferenceService [success-200-isvc-75b0d] is Ready

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-75b0d

InferenceServiceReady

InferenceService [error-404-isvc-75b0d] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-75b0d

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-64806-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-64806-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

ensemble-graph-75b0d-699b59b6f-snfrm

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine

kserve-ci-e2e-test

multus

ensemble-graph-75b0d-699b59b6f-snfrm

AddedInterface

Add eth0 [10.132.0.41/23] from ovn-kubernetes

kserve-ci-e2e-test

replicaset-controller

ensemble-graph-75b0d-699b59b6f

SuccessfulCreate

Created pod: ensemble-graph-75b0d-699b59b6f-snfrm

kserve-ci-e2e-test

deployment-controller

ensemble-graph-75b0d

ScalingReplicaSet

Scaled up replica set ensemble-graph-75b0d-699b59b6f from 0 to 1

kserve-ci-e2e-test

kubelet

ensemble-graph-75b0d-699b59b6f-snfrm

Created

Created container: ensemble-graph-75b0d

kserve-ci-e2e-test

kubelet

ensemble-graph-75b0d-699b59b6f-snfrm

Started

Started container ensemble-graph-75b0d
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-75b0d-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-75b0d-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

InferenceGraphController

ensemble-graph-75b0d

InferenceGraphReady

InferenceGraph [ensemble-graph-75b0d] is Ready
(x6)

kserve-ci-e2e-test

kubelet

sequence-graph-c1a35-5b75598d54-4l255

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-64806-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-64806-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-75b0d-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-75b0d-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-75b0d-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-75b0d-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-64806

InferenceServiceReady

InferenceService [error-404-isvc-64806] is Ready

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-64806

InferenceServiceReady

InferenceService [success-200-isvc-64806] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-64806

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-64806

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

deployment-controller

sequence-graph-64806

ScalingReplicaSet

Scaled up replica set sequence-graph-64806-67786fdb9f from 0 to 1

kserve-ci-e2e-test

replicaset-controller

sequence-graph-64806-67786fdb9f

SuccessfulCreate

Created pod: sequence-graph-64806-67786fdb9f-r75d4

kserve-ci-e2e-test

kubelet

sequence-graph-64806-67786fdb9f-r75d4

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "sequence-graph-64806-serving-cert" not found

kserve-ci-e2e-test

InferenceGraphController

sequence-graph-64806

InferenceGraphReady

InferenceGraph [sequence-graph-64806] is Ready
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-64806-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-64806-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

sequence-graph-64806-67786fdb9f-r75d4

Created

Created container: sequence-graph-64806

kserve-ci-e2e-test

kubelet

sequence-graph-64806-67786fdb9f-r75d4

Started

Started container sequence-graph-64806

kserve-ci-e2e-test

kubelet

sequence-graph-64806-67786fdb9f-r75d4

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine

kserve-ci-e2e-test

multus

sequence-graph-64806-67786fdb9f-r75d4

AddedInterface

Add eth0 [10.132.0.42/23] from ovn-kubernetes
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

ensemble-graph-75b0d

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

ensemble-graph-75b0d

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-64806-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-64806-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

sequence-graph-64806

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

sequence-graph-64806

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

ensemble-graph-75b0d-699b59b6f-snfrm

Killing

Stopping container ensemble-graph-75b0d

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-4efea

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-4efea": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Killing

Stopping container kserve-container

kserve-ci-e2e-test

multus

error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

AddedInterface

Add eth0 [10.133.0.35/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-4efea-predictor-6bd955dcc9

SuccessfulCreate

Created pod: success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-4efea-predictor-5fb967fbbd

SuccessfulCreate

Created pod: error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-4efea

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-4efea": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-4efea": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-75b0d-predictor-758f7456fd-b6kvp

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-4efea

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-4efea": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-4efea

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-4efea": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-4efea": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

multus

success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

AddedInterface

Add eth0 [10.132.0.43/23] from ovn-kubernetes

kserve-ci-e2e-test

deployment-controller

success-200-isvc-4efea-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-4efea-predictor-6bd955dcc9 from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f

Started

Started container kserve-container

kserve-ci-e2e-test

deployment-controller

error-404-isvc-4efea-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-4efea-predictor-5fb967fbbd from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-75b0d-predictor-9549449fc-92fhr

Unhealthy

Readiness probe failed: Get "https://10.132.0.38:8643/healthz": dial tcp 10.132.0.38:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | success-200-isvc-75b0d-predictor-9549449fc-92fhr | Unhealthy | Readiness probe failed: dial tcp 10.132.0.38:8080: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | ensemble-graph-75b0d-699b59b6f-snfrm | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-64806-67786fdb9f-r75d4 | Killing | Stopping container sequence-graph-64806
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-98aac | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-98aac": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-98aac | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-98aac": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-98aac": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-64806-predictor-6bdc69944f-rmdz9 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-64806-predictor-6bdc69944f-rmdz9 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | deployment-controller | success-200-isvc-98aac-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-98aac-predictor-7df4fb9989 from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-98aac-predictor-serving-cert" not found
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-98aac-predictor-7df4fb9989 | SuccessfulCreate | Created pod: success-200-isvc-98aac-predictor-7df4fb9989-xwljl
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-98aac-predictor-serving-cert" not found
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-98aac | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-98aac": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-98aac": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-98aac | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-98aac": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-64806-predictor-5cd95b49d6-g7v8z | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | deployment-controller | error-404-isvc-98aac-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-98aac-predictor-5678f7cc74 from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-64806-predictor-5cd95b49d6-g7v8z | Killing | Stopping container kserve-container
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-98aac-predictor-5678f7cc74 | SuccessfulCreate | Created pod: error-404-isvc-98aac-predictor-5678f7cc74-45scf
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | multus | error-404-isvc-98aac-predictor-5678f7cc74-45scf | AddedInterface | Add eth0 [10.133.0.36/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | AddedInterface | Add eth0 [10.132.0.44/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f | Unhealthy | Readiness probe failed: dial tcp 10.132.0.43:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8 | Unhealthy | Readiness probe failed: dial tcp 10.133.0.35:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-64806-predictor-6bdc69944f-rmdz9 | Unhealthy | Readiness probe failed: dial tcp 10.132.0.40:8080: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | success-200-isvc-64806-predictor-6bdc69944f-rmdz9 | Unhealthy | Readiness probe failed: Get "https://10.132.0.40:8643/healthz": dial tcp 10.132.0.40:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-64806-predictor-5cd95b49d6-g7v8z | Unhealthy | Readiness probe failed: dial tcp 10.133.0.34:8080: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | error-404-isvc-64806-predictor-5cd95b49d6-g7v8z | Unhealthy | Readiness probe failed: Get "https://10.133.0.34:8643/healthz": dial tcp 10.133.0.34:8643: connect: connection refused
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-4efea-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-4efea-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-4efea-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-4efea-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-4efea | InferenceServiceReady | InferenceService [success-200-isvc-4efea] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-4efea | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-4efea | InferenceServiceReady | InferenceService [error-404-isvc-4efea] is Ready
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-4efea | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-4efea-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-4efea-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | InferenceGraphController | splitter-graph-4efea | InferenceGraphReady | InferenceGraph [splitter-graph-4efea] is Ready
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-4efea-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | deployment-controller | splitter-graph-4efea | ScalingReplicaSet | Scaled up replica set splitter-graph-4efea-5bb974bbff from 0 to 1
kserve-ci-e2e-test | replicaset-controller | splitter-graph-4efea-5bb974bbff | SuccessfulCreate | Created pod: splitter-graph-4efea-5bb974bbff-2dr89
kserve-ci-e2e-test | multus | splitter-graph-4efea-5bb974bbff-2dr89 | AddedInterface | Add eth0 [10.132.0.45/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | splitter-graph-4efea-5bb974bbff-2dr89 | Started | Started container splitter-graph-4efea
kserve-ci-e2e-test | kubelet | splitter-graph-4efea-5bb974bbff-2dr89 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-4efea-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | splitter-graph-4efea-5bb974bbff-2dr89 | Created | Created container: splitter-graph-4efea
kserve-ci-e2e-test | kubelet | sequence-graph-64806-67786fdb9f-r75d4 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x6)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-98aac-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-98aac-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Created | Created container: kserve-container
kserve-ci-e2e-test | multus | error-404-isvc-12404-predictor-6cb5f67968-778ps | AddedInterface | Add eth0 [10.133.0.37/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | deployment-controller | error-404-isvc-12404-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-12404-predictor-6cb5f67968 from 0 to 1
kserve-ci-e2e-test | kubelet | splitter-graph-4efea-5bb974bbff-2dr89 | Killing | Stopping container splitter-graph-4efea
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | deployment-controller | success-200-isvc-12404-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-12404-predictor-564cf4d979 from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-12404-predictor-564cf4d979 | SuccessfulCreate | Created pod: success-200-isvc-12404-predictor-564cf4d979-76fsj
kserve-ci-e2e-test | kubelet | error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-12404-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-12404-predictor-6cb5f67968 | SuccessfulCreate | Created pod: error-404-isvc-12404-predictor-6cb5f67968-778ps
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1447" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-12404-predictor-564cf4d979-76fsj | AddedInterface | Add eth0 [10.132.0.46/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-4efea-predictor-6bd955dcc9-9tf9f | Unhealthy | Readiness probe failed: Get "https://10.132.0.43:8643/healthz": dial tcp 10.132.0.43:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-4efea-predictor-5fb967fbbd-hrpx8 | Unhealthy | Readiness probe failed: Get "https://10.133.0.35:8643/healthz": dial tcp 10.133.0.35:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Unhealthy | Readiness probe failed: dial tcp 10.133.0.36:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Unhealthy | Readiness probe failed: dial tcp 10.132.0.44:8080: connect: connection refused (x6)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-98aac | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-98aac | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-98aac | InferenceServiceReady | InferenceService [error-404-isvc-98aac] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-98aac | InferenceServiceReady | InferenceService [success-200-isvc-98aac] is Ready
kserve-ci-e2e-test | InferenceGraphController | switch-graph-98aac | UpdateFailed | Failed to update status for InferenceGraph "switch-graph-98aac": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "switch-graph-98aac": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | InferenceGraphController | switch-graph-98aac | InferenceGraphReady | InferenceGraph [switch-graph-98aac] is Ready
kserve-ci-e2e-test | deployment-controller | switch-graph-98aac | ScalingReplicaSet | Scaled up replica set switch-graph-98aac-579b965bb4 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | switch-graph-98aac-579b965bb4 | SuccessfulCreate | Created pod: switch-graph-98aac-579b965bb4-xcntc
kserve-ci-e2e-test | kubelet | switch-graph-98aac-579b965bb4-xcntc | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "switch-graph-98aac-serving-cert" not found
kserve-ci-e2e-test | InferenceGraphController | switch-graph-98aac | InternalError | fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "switch-graph-98aac": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-98aac-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-98aac-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x4)
kserve-ci-e2e-test | kubelet | splitter-graph-4efea-5bb974bbff-2dr89 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x6)
kserve-ci-e2e-test | kubelet | switch-graph-98aac-579b965bb4-xcntc | Created | Created container: switch-graph-98aac
kserve-ci-e2e-test | multus | switch-graph-98aac-579b965bb4-xcntc | AddedInterface | Add eth0 [10.132.0.47/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | switch-graph-98aac-579b965bb4-xcntc | Started | Started container switch-graph-98aac
kserve-ci-e2e-test | kubelet | switch-graph-98aac-579b965bb4-xcntc | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-12404-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-12404-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-12404-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-12404-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Unhealthy | Readiness probe failed: dial tcp 10.133.0.37:8080: connect: connection refused (x6)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-98aac-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-98aac-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-98aac-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-98aac-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-12404 | InferenceServiceReady | InferenceService [success-200-isvc-12404] is Ready
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-12404 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-12404 | InferenceServiceReady | InferenceService [error-404-isvc-12404] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-12404 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x6)
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-98aac | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-98aac | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | InferenceGraphController | splitter-graph-12404 | InferenceGraphReady | InferenceGraph [splitter-graph-12404] is Ready
kserve-ci-e2e-test | kubelet | splitter-graph-12404-554567d986-qpd8n | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f2021b9f79dbd151723ae4c386dd28bedd0fff23f6daa0825bdc58598fc71887" already present on machine
kserve-ci-e2e-test | multus | splitter-graph-12404-554567d986-qpd8n | AddedInterface | Add eth0 [10.132.0.48/23] from ovn-kubernetes
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-12404-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | deployment-controller | splitter-graph-12404 | ScalingReplicaSet | Scaled up replica set splitter-graph-12404-554567d986 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | splitter-graph-12404-554567d986 | SuccessfulCreate | Created pod: splitter-graph-12404-554567d986-qpd8n
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-12404-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | kubelet | splitter-graph-12404-554567d986-qpd8n | Created | Created container: splitter-graph-12404
kserve-ci-e2e-test | kubelet | splitter-graph-12404-554567d986-qpd8n | Started | Started container splitter-graph-12404
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-12404-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-12404-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | splitter-graph-12404 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | splitter-graph-12404 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-12404-predictor-6cb5f67968-778ps | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | splitter-graph-12404-554567d986-qpd8n | Killing | Stopping container splitter-graph-12404
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Unhealthy | Readiness probe failed: dial tcp 10.132.0.46:8080: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | success-200-isvc-12404-predictor-564cf4d979-76fsj | Unhealthy | Readiness probe failed: Get "https://10.132.0.46:8643/healthz": dial tcp 10.132.0.46:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | splitter-graph-12404-554567d986-qpd8n | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-98aac-predictor-7df4fb9989-xwljl | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | switch-graph-98aac-579b965bb4-xcntc | Killing | Stopping container switch-graph-98aac
kserve-ci-e2e-test | kubelet | error-404-isvc-98aac-predictor-5678f7cc74-45scf | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | switch-graph-98aac-579b965bb4-xcntc | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x5)