Time Namespace Component RelatedObject Reason Message

kserve-ci-e2e-test

success-200-isvc-7541f-predictor-85fdf68876-fwrsn

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

switch-graph-127f1-5bd7f94589-flj29

Scheduled

Successfully assigned kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29 to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

error-404-isvc-38d71-predictor-5cd74bff75-4bl6s

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

success-200-isvc-e6f34-predictor-57474bf446-r75x6

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6 to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

error-404-isvc-63de8-predictor-544568fdd5-wfwsj

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

splitter-graph-ab6b8-54bf769bd5-jnmp5

Scheduled

Successfully assigned kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5 to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

splitter-graph-57557-86766cd968-2dzqj

Scheduled

Successfully assigned kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

sequence-graph-e6f34-55f69b6976-9c549

Scheduled

Successfully assigned kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549 to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

switch-graph-9290e-8cc455c84-hqz47

Scheduled

Successfully assigned kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47 to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

switch-graph-60a5c-64b647654d-2knr4

Scheduled

Successfully assigned kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4 to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

ensemble-graph-7541f-6b5b9965d4-lj2ph

Scheduled

Successfully assigned kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

ensemble-graph-38d71-7dbc85bf6b-hh2kv

Scheduled

Successfully assigned kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

error-404-isvc-7541f-predictor-6b4d6f7547-62k8s

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

error-404-isvc-9290e-predictor-df657ff87-2cvh9

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9 to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

sequence-graph-fe410-7759986577-6jbfq

Scheduled

Successfully assigned kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

model-chainer-76487779f8-46tnx

Scheduled

Successfully assigned kserve-ci-e2e-test/model-chainer-76487779f8-46tnx to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29 to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

success-200-isvc-38d71-predictor-c6f86d4b4-w2stk

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

success-200-isvc-fe410-predictor-66c5568bb-9sm5z

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

sequence-graph-63de8-85847f4f4d-lglk2

Scheduled

Successfully assigned kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2 to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf to ip-10-0-131-234.ec2.internal

kserve-ci-e2e-test

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

Scheduled

Successfully assigned kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n to ip-10-0-140-19.ec2.internal

kserve-ci-e2e-test

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8 to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw

Scheduled

Successfully assigned kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw to ip-10-0-130-227.ec2.internal

kserve-ci-e2e-test

deployment-controller

isvc-sklearn-graph-1-predictor

ScalingReplicaSet

Scaled up replica set isvc-sklearn-graph-1-predictor-5b497dcd98 from 0 to 1

kserve-ci-e2e-test

multus

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

AddedInterface

Add eth0 [10.132.0.17/23] from ovn-kubernetes

kserve-ci-e2e-test

deployment-controller

isvc-sklearn-graph-2-predictor

ScalingReplicaSet

Scaled up replica set isvc-sklearn-graph-2-predictor-847f6cf74b from 0 to 1

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-127f1-predictor-548d5f46c9

SuccessfulCreate

Created pod: error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

kserve-ci-e2e-test

deployment-controller

error-404-isvc-127f1-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-127f1-predictor-548d5f46c9 from 0 to 1

kserve-ci-e2e-test

deployment-controller

isvc-xgboost-graph-predictor

ScalingReplicaSet

Scaled up replica set isvc-xgboost-graph-predictor-669d8d6456 from 0 to 1

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "isvc-xgboost-graph-predictor-serving-cert" not found

kserve-ci-e2e-test

replicaset-controller

isvc-xgboost-graph-predictor-669d8d6456

SuccessfulCreate

Created pod: isvc-xgboost-graph-predictor-669d8d6456-52hdf

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0027065b33c40f52a3d9fc655cbe89bc4c54c77f7f755275def0561338196f29"

kserve-ci-e2e-test

deployment-controller

success-200-isvc-127f1-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-127f1-predictor-68b975dccb from 0 to 1

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-127f1-predictor-68b975dccb

SuccessfulCreate

Created pod: success-200-isvc-127f1-predictor-68b975dccb-dcphk

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-127f1-predictor-serving-cert" not found

kserve-ci-e2e-test

replicaset-controller

isvc-sklearn-graph-1-predictor-5b497dcd98

SuccessfulCreate

Created pod: isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0027065b33c40f52a3d9fc655cbe89bc4c54c77f7f755275def0561338196f29"

kserve-ci-e2e-test

replicaset-controller

isvc-sklearn-graph-2-predictor-847f6cf74b

SuccessfulCreate

Created pod: isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

kserve-ci-e2e-test

multus

isvc-xgboost-graph-predictor-669d8d6456-52hdf

AddedInterface

Add eth0 [10.134.0.24/23] from ovn-kubernetes

kserve-ci-e2e-test

multus

success-200-isvc-127f1-predictor-68b975dccb-dcphk

AddedInterface

Add eth0 [10.132.0.18/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Pulling

Pulling image "quay.io/opendatahub/error-404-isvc:odh-pr-1293"

kserve-ci-e2e-test

multus

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

AddedInterface

Add eth0 [10.133.0.33/23] from ovn-kubernetes

kserve-ci-e2e-test

multus

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

AddedInterface

Add eth0 [10.132.0.19/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0027065b33c40f52a3d9fc655cbe89bc4c54c77f7f755275def0561338196f29"

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Pulling

Pulling image "quay.io/opendatahub/success-200-isvc:odh-pr-1293"

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0027065b33c40f52a3d9fc655cbe89bc4c54c77f7f755275def0561338196f29" in 3.741s (3.741s including waiting). Image size: 300951977 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Pulling

Pulling image "kserve/xgbserver:latest"

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Pulled

Successfully pulled image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" in 13.776s (13.776s including waiting). Image size: 1332215401 bytes.

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0027065b33c40f52a3d9fc655cbe89bc4c54c77f7f755275def0561338196f29" in 13.269s (13.269s including waiting). Image size: 300951977 bytes.

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Pulled

Successfully pulled image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" in 13.431s (13.431s including waiting). Image size: 1332277345 bytes.

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0027065b33c40f52a3d9fc655cbe89bc4c54c77f7f755275def0561338196f29" in 14.359s (14.359s including waiting). Image size: 300951977 bytes.

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.408s (2.409s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.9s (2.9s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Pulling

Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1293"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Pulling

Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1293"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Pulled

Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1293" in 7.491s (7.491s including waiting). Image size: 1560610732 bytes.

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29

Pulled

Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1293" in 7.503s (7.503s including waiting). Image size: 1560610732 bytes.
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-127f1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-127f1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Pulled

Successfully pulled image "kserve/xgbserver:latest" in 26.11s (26.11s including waiting). Image size: 1306414331 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.176s (2.176s including waiting). Image size: 211946088 bytes.
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-127f1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-127f1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Unhealthy

Readiness probe failed: dial tcp 10.133.0.33:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Unhealthy

Readiness probe failed: dial tcp 10.132.0.18:8080: connect: connection refused
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-2-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-2-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-127f1

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-127f1

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-127f1

InferenceServiceReady

InferenceService [success-200-isvc-127f1] is Ready

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-127f1

InferenceServiceReady

InferenceService [error-404-isvc-127f1] is Ready
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-127f1-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-127f1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

deployment-controller

switch-graph-127f1

ScalingReplicaSet

Scaled up replica set switch-graph-127f1-5bd7f94589 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

switch-graph-127f1-5bd7f94589

SuccessfulCreate

Created pod: switch-graph-127f1-5bd7f94589-flj29

kserve-ci-e2e-test

InferenceGraphController

switch-graph-127f1

InferenceGraphReady

InferenceGraph [switch-graph-127f1] is Ready

kserve-ci-e2e-test

kubelet

switch-graph-127f1-5bd7f94589-flj29

Pulling

Pulling image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f"

kserve-ci-e2e-test

multus

switch-graph-127f1-5bd7f94589-flj29

AddedInterface

Add eth0 [10.134.0.25/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

switch-graph-127f1-5bd7f94589-flj29

Started

Started container switch-graph-127f1

kserve-ci-e2e-test

kubelet

switch-graph-127f1-5bd7f94589-flj29

Created

Created container: switch-graph-127f1

kserve-ci-e2e-test

kubelet

switch-graph-127f1-5bd7f94589-flj29

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" in 2.003s (2.003s including waiting). Image size: 216233553 bytes.
(x7)

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-52hdf

Unhealthy

Readiness probe failed: dial tcp 10.134.0.24:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

switch-graph-127f1-5bd7f94589-flj29

Killing

Stopping container switch-graph-127f1

kserve-ci-e2e-test

kubelet

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-9290e-predictor-serving-cert" not found

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-9290e-predictor-df657ff87

SuccessfulCreate

Created pod: error-404-isvc-9290e-predictor-df657ff87-2cvh9

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-9290e-predictor-5b44755bcb

SuccessfulCreate

Created pod: success-200-isvc-9290e-predictor-5b44755bcb-smm7n

kserve-ci-e2e-test

deployment-controller

success-200-isvc-9290e-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-9290e-predictor-5b44755bcb from 0 to 1

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-9290e

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-9290e": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-9290e": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-9290e

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-9290e": the object has been modified; please apply your changes to the latest version and try again
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-127f1-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-127f1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

error-404-isvc-9290e-predictor-df657ff87-2cvh9

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

multus

error-404-isvc-9290e-predictor-df657ff87-2cvh9

AddedInterface

Add eth0 [10.133.0.34/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-127f1-predictor-68b975dccb-dcphk

Killing

Stopping container kserve-container

kserve-ci-e2e-test

deployment-controller

error-404-isvc-9290e-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-9290e-predictor-df657ff87 from 0 to 1

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-127f1-predictor-548d5f46c9-cb6dd

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-9290e-predictor-df657ff87-2cvh9

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-9290e-predictor-df657ff87-2cvh9

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-9290e-predictor-df657ff87-2cvh9

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-9290e-predictor-df657ff87-2cvh9

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

multus

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

AddedInterface

Add eth0 [10.132.0.20/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-9290e-predictor-df657ff87-2cvh9

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-9290e-predictor-5b44755bcb-smm7n

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-2

InferenceServiceReady

InferenceService [isvc-sklearn-graph-2] is Ready

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-1

InferenceServiceReady

InferenceService [isvc-sklearn-graph-1] is Ready
(x10)

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-1

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x9)

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-2

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x9)

kserve-ci-e2e-test

v1beta1Controllers

isvc-xgboost-graph

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

isvc-xgboost-graph

InferenceServiceReady

InferenceService [isvc-xgboost-graph] is Ready
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x6)

kserve-ci-e2e-test

kubelet

switch-graph-127f1-5bd7f94589-flj29

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

deployment-controller

model-chainer

ScalingReplicaSet

Scaled up replica set model-chainer-76487779f8 from 0 to 1

kserve-ci-e2e-test | replicaset-controller | model-chainer-76487779f8 | SuccessfulCreate | Created pod: model-chainer-76487779f8-46tnx
(x4) kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | InferenceGraphController | model-chainer | InferenceGraphReady | InferenceGraph [model-chainer] is Ready
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-9290e-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-9290e-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-9290e-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-9290e-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4) kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4) kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4) kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x6) kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6) kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | model-chainer-76487779f8-46tnx | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "model-chainer-serving-cert" not found
kserve-ci-e2e-test | multus | model-chainer-76487779f8-46tnx | AddedInterface | Add eth0 [10.134.0.26/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | model-chainer-76487779f8-46tnx | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine
kserve-ci-e2e-test | kubelet | model-chainer-76487779f8-46tnx | Started | Started container model-chainer
kserve-ci-e2e-test | kubelet | model-chainer-76487779f8-46tnx | Created | Created container: model-chainer
(x6) kserve-ci-e2e-test | kubelet | success-200-isvc-9290e-predictor-5b44755bcb-smm7n | Unhealthy | Readiness probe failed: dial tcp 10.132.0.20:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz | Killing | Stopping container kube-rbac-proxy

kserve-ci-e2e-test | multus | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | AddedInterface | Add eth0 [10.132.0.21/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-e6f34-predictor-57474bf446 | SuccessfulCreate | Created pod: success-200-isvc-e6f34-predictor-57474bf446-r75x6
kserve-ci-e2e-test | deployment-controller | error-404-isvc-e6f34-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-e6f34-predictor-595bdccbdf from 0 to 1
kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-52hdf | Killing | Stopping container kserve-container
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-e6f34-predictor-595bdccbdf | SuccessfulCreate | Created pod: error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw
kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-52hdf | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | model-chainer-76487779f8-46tnx | Killing | Stopping container model-chainer
kserve-ci-e2e-test | deployment-controller | success-200-isvc-e6f34-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-e6f34-predictor-57474bf446 from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | multus | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | AddedInterface | Add eth0 [10.133.0.35/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29 | Unhealthy | Readiness probe failed: Get "https://10.132.0.17:8643/healthz": dial tcp 10.132.0.17:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz | Unhealthy | Readiness probe failed: Get "https://10.132.0.19:8643/healthz": dial tcp 10.132.0.19:8643: connect: connection refused
(x9) kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29 | Unhealthy | Readiness probe failed: dial tcp 10.132.0.17:8080: connect: connection refused
(x9) kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz | Unhealthy | Readiness probe failed: dial tcp 10.132.0.19:8080: connect: connection refused
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-9290e | InferenceServiceReady | InferenceService [error-404-isvc-9290e] is Ready
(x5) kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-9290e | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x6) kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-9290e | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-9290e | InferenceServiceReady | InferenceService [success-200-isvc-9290e] is Ready
kserve-ci-e2e-test | InferenceGraphController | switch-graph-9290e | InferenceGraphReady | InferenceGraph [switch-graph-9290e] is Ready
kserve-ci-e2e-test | deployment-controller | switch-graph-9290e | ScalingReplicaSet | Scaled up replica set switch-graph-9290e-8cc455c84 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | switch-graph-9290e-8cc455c84 | SuccessfulCreate | Created pod: switch-graph-9290e-8cc455c84-hqz47
kserve-ci-e2e-test | kubelet | switch-graph-9290e-8cc455c84-hqz47 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "switch-graph-9290e-serving-cert" not found
kserve-ci-e2e-test | kubelet | switch-graph-9290e-8cc455c84-hqz47 | Created | Created container: switch-graph-9290e
kserve-ci-e2e-test | kubelet | switch-graph-9290e-8cc455c84-hqz47 | Started | Started container switch-graph-9290e
kserve-ci-e2e-test | multus | switch-graph-9290e-8cc455c84-hqz47 | AddedInterface | Add eth0 [10.134.0.27/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | switch-graph-9290e-8cc455c84-hqz47 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine
(x6) kserve-ci-e2e-test | kubelet | model-chainer-76487779f8-46tnx | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-e6f34-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-e6f34-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-e6f34-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-e6f34-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-9290e-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-9290e-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-9290e-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-9290e-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x6) kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Unhealthy | Readiness probe failed: dial tcp 10.132.0.21:8080: connect: connection refused
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-e6f34 | InferenceServiceReady | InferenceService [error-404-isvc-e6f34] is Ready
(x5) kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-e6f34 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-e6f34 | InferenceServiceReady | InferenceService [success-200-isvc-e6f34] is Ready
(x5) kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-e6f34 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test | deployment-controller | sequence-graph-e6f34 | ScalingReplicaSet | Scaled up replica set sequence-graph-e6f34-55f69b6976 from 0 to 1
kserve-ci-e2e-test | kubelet | sequence-graph-e6f34-55f69b6976-9c549 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-e6f34-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-e6f34-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-e6f34 | InternalError | fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "sequence-graph-e6f34": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-e6f34 | UpdateFailed | Failed to update status for InferenceGraph "sequence-graph-e6f34": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "sequence-graph-e6f34": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-e6f34 | InferenceGraphReady | InferenceGraph [sequence-graph-e6f34] is Ready
kserve-ci-e2e-test | multus | sequence-graph-e6f34-55f69b6976-9c549 | AddedInterface | Add eth0 [10.134.0.28/23] from ovn-kubernetes
kserve-ci-e2e-test | replicaset-controller | sequence-graph-e6f34-55f69b6976 | SuccessfulCreate | Created pod: sequence-graph-e6f34-55f69b6976-9c549
kserve-ci-e2e-test | kubelet | sequence-graph-e6f34-55f69b6976-9c549 | Started | Started container sequence-graph-e6f34
kserve-ci-e2e-test | kubelet | sequence-graph-e6f34-55f69b6976-9c549 | Created | Created container: sequence-graph-e6f34
(x3) kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-9290e | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3) kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-9290e | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-e6f34 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-e6f34 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-e6f34 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-e6f34-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-e6f34 | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-e6f34-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | error-404-isvc-9290e-predictor-df657ff87-2cvh9 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-7541f-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | success-200-isvc-9290e-predictor-5b44755bcb-smm7n | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-9290e-predictor-5b44755bcb-smm7n | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-7541f | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-7541f": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-7541f | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-7541f": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-7541f": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test | multus | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | AddedInterface | Add eth0 [10.133.0.36/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-7541f-predictor-6b4d6f7547 | SuccessfulCreate | Created pod: error-404-isvc-7541f-predictor-6b4d6f7547-62k8s
kserve-ci-e2e-test | deployment-controller | error-404-isvc-7541f-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-7541f-predictor-6b4d6f7547 from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-7541f | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-7541f": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-7541f": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-7541f | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-7541f": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-9290e-predictor-df657ff87-2cvh9 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | deployment-controller | success-200-isvc-7541f-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-7541f-predictor-85fdf68876 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-7541f-predictor-85fdf68876 | SuccessfulCreate | Created pod: success-200-isvc-7541f-predictor-85fdf68876-fwrsn
kserve-ci-e2e-test | kubelet | switch-graph-9290e-8cc455c84-hqz47 | Killing | Stopping container switch-graph-9290e
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Started | Started container kserve-container
kserve-ci-e2e-test | multus | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | AddedInterface | Add eth0 [10.132.0.22/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Created | Created container: kserve-container
(x7) kserve-ci-e2e-test | kubelet | error-404-isvc-9290e-predictor-df657ff87-2cvh9 | Unhealthy | Readiness probe failed: dial tcp 10.133.0.34:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-9290e-predictor-df657ff87-2cvh9 | Unhealthy | Readiness probe failed: Get "https://10.133.0.34:8643/healthz": dial tcp 10.133.0.34:8643: connect: connection refused
(x6) kserve-ci-e2e-test | kubelet | switch-graph-9290e-8cc455c84-hqz47 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-7541f-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-7541f-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-7541f-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-7541f-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6) kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Unhealthy | Readiness probe failed: dial tcp 10.133.0.36:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | sequence-graph-e6f34-55f69b6976-9c549 | Killing | Stopping container sequence-graph-e6f34
(x6) kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Unhealthy | Readiness probe failed: dial tcp 10.132.0.22:8080: connect: connection refused
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-63de8-predictor-544568fdd5 | SuccessfulCreate | Created pod: error-404-isvc-63de8-predictor-544568fdd5-wfwsj
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-63de8 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-63de8": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-63de8": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-63de8 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-63de8": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-63de8": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-63de8 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-63de8": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Killing | Stopping container kube-rbac-proxy
(x2) kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-63de8-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Killing | Stopping container kserve-container
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-63de8-predictor-7d6b9c5fd | SuccessfulCreate | Created pod: success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-e6f34-predictor-57474bf446-r75x6 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | multus | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | AddedInterface | Add eth0 [10.133.0.37/23] from ovn-kubernetes
kserve-ci-e2e-test | deployment-controller | error-404-isvc-63de8-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-63de8-predictor-544568fdd5 from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | deployment-controller | success-200-isvc-63de8-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-63de8-predictor-7d6b9c5fd from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Created | Created container: kserve-container
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-63de8 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-63de8": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | multus | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | AddedInterface | Add eth0 [10.132.0.23/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Unhealthy | Readiness probe failed: Get "https://10.133.0.35:8643/healthz": dial tcp 10.133.0.35:8643: connect: connection refused
(x7) kserve-ci-e2e-test | kubelet | error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw | Unhealthy | Readiness probe failed: dial tcp 10.133.0.35:8080: connect: connection refused
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-7541f | InferenceServiceReady | InferenceService [success-200-isvc-7541f] is Ready
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-7541f | InferenceServiceReady | InferenceService [error-404-isvc-7541f] is Ready
(x5) kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-7541f | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5) kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-7541f | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-7541f | InferenceGraphReady | InferenceGraph [ensemble-graph-7541f] is Ready
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-7541f-6b5b9965d4 | SuccessfulCreate | Created pod: ensemble-graph-7541f-6b5b9965d4-lj2ph
kserve-ci-e2e-test | deployment-controller | ensemble-graph-7541f | ScalingReplicaSet | Scaled up replica set ensemble-graph-7541f-6b5b9965d4 from 0 to 1
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-7541f-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-7541f-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-7541f-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2) kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-7541f-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test | kubelet | ensemble-graph-7541f-6b5b9965d4-lj2ph | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "ensemble-graph-7541f-serving-cert" not found
kserve-ci-e2e-test | kubelet | ensemble-graph-7541f-6b5b9965d4-lj2ph | Started | Started container ensemble-graph-7541f
kserve-ci-e2e-test | kubelet | ensemble-graph-7541f-6b5b9965d4-lj2ph | Created | Created container: ensemble-graph-7541f
kserve-ci-e2e-test | kubelet | ensemble-graph-7541f-6b5b9965d4-lj2ph | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine
kserve-ci-e2e-test | multus | ensemble-graph-7541f-6b5b9965d4-lj2ph | AddedInterface | Add eth0 [10.134.0.29/23] from ovn-kubernetes (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-e6f34-55f69b6976-9c549 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-63de8-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-38d71 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-38d71": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-38d71": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | ensemble-graph-7541f-6b5b9965d4-lj2ph | Killing | Stopping container ensemble-graph-7541f
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-38d71-predictor-5cd74bff75 | SuccessfulCreate | Created pod: error-404-isvc-38d71-predictor-5cd74bff75-4bl6s
kserve-ci-e2e-test | deployment-controller | error-404-isvc-38d71-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-38d71-predictor-5cd74bff75 from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-7541f-predictor-85fdf68876-fwrsn | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-38d71-predictor-serving-cert" not found
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-38d71-predictor-c6f86d4b4 | SuccessfulCreate | Created pod: success-200-isvc-38d71-predictor-c6f86d4b4-w2stk
kserve-ci-e2e-test | deployment-controller | success-200-isvc-38d71-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-38d71-predictor-c6f86d4b4 from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-38d71 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-38d71": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-38d71": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-38d71 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-38d71": the object has been modified; please apply your changes to the latest version and try again (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-63de8-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-38d71 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-38d71": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | multus | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | AddedInterface | Add eth0 [10.132.0.24/23] from ovn-kubernetes

kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | multus | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | AddedInterface | Add eth0 [10.133.0.38/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-7541f-predictor-6b4d6f7547-62k8s | Unhealthy | Readiness probe failed: Get "https://10.133.0.36:8643/healthz": dial tcp 10.133.0.36:8643: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Unhealthy | Readiness probe failed: dial tcp 10.133.0.37:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Unhealthy | Readiness probe failed: dial tcp 10.132.0.23:8080: connect: connection refused
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-38d71-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-38d71-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-63de8-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-63de8-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-63de8 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-63de8 | InferenceServiceReady | InferenceService [error-404-isvc-63de8] is Ready (x6)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-63de8 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-63de8 | InferenceServiceReady | InferenceService [success-200-isvc-63de8] is Ready (x6)
kserve-ci-e2e-test | kubelet | ensemble-graph-7541f-6b5b9965d4-lj2ph | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-38d71-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | sequence-graph-63de8-85847f4f4d-lglk2 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "sequence-graph-63de8-serving-cert" not found
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-63de8-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test | InferenceGraphController | sequence-graph-63de8 | InferenceGraphReady | InferenceGraph [sequence-graph-63de8] is Ready
kserve-ci-e2e-test | deployment-controller | sequence-graph-63de8 | ScalingReplicaSet | Scaled up replica set sequence-graph-63de8-85847f4f4d from 0 to 1
kserve-ci-e2e-test | replicaset-controller | sequence-graph-63de8-85847f4f4d | SuccessfulCreate | Created pod: sequence-graph-63de8-85847f4f4d-lglk2 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-63de8-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-63de8-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-38d71-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-63de8-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | multus | sequence-graph-63de8-85847f4f4d-lglk2 | AddedInterface | Add eth0 [10.134.0.30/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | sequence-graph-63de8-85847f4f4d-lglk2 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine
kserve-ci-e2e-test | kubelet | sequence-graph-63de8-85847f4f4d-lglk2 | Created | Created container: sequence-graph-63de8
kserve-ci-e2e-test | kubelet | sequence-graph-63de8-85847f4f4d-lglk2 | Started | Started container sequence-graph-63de8 (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Unhealthy | Readiness probe failed: dial tcp 10.132.0.24:8080: connect: connection refused
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-fe410 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-fe410": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-fe410": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-fe410 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-fe410": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Created | Created container: kserve-container
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-fe410 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-fe410": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-fe410 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-fe410": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-fe410": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-fe410-predictor-66c5568bb | SuccessfulCreate | Created pod: success-200-isvc-fe410-predictor-66c5568bb-9sm5z

kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-fe410-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | sequence-graph-63de8-85847f4f4d-lglk2 | Killing | Stopping container sequence-graph-63de8
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-fe410-predictor-6b76d4c7f7 | SuccessfulCreate | Created pod: error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp
kserve-ci-e2e-test | deployment-controller | success-200-isvc-fe410-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-fe410-predictor-66c5568bb from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Killing | Stopping container kserve-container
kserve-ci-e2e-test | multus | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | AddedInterface | Add eth0 [10.132.0.25/23] from ovn-kubernetes
kserve-ci-e2e-test | deployment-controller | error-404-isvc-fe410-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-fe410-predictor-6b76d4c7f7 from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | multus | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | AddedInterface | Add eth0 [10.133.0.39/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-63de8-predictor-544568fdd5-wfwsj | Unhealthy | Readiness probe failed: Get "https://10.133.0.37:8643/healthz": dial tcp 10.133.0.37:8643: connect: connection refused (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-38d71 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-38d71 | InferenceServiceReady | InferenceService [error-404-isvc-38d71] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-38d71 | InferenceServiceReady | InferenceService [success-200-isvc-38d71] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-38d71 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | deployment-controller | ensemble-graph-38d71 | ScalingReplicaSet | Scaled up replica set ensemble-graph-38d71-7dbc85bf6b from 0 to 1
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-38d71-7dbc85bf6b | SuccessfulCreate | Created pod: ensemble-graph-38d71-7dbc85bf6b-hh2kv
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-38d71 | InferenceGraphReady | InferenceGraph [ensemble-graph-38d71] is Ready (x2)
kserve-ci-e2e-test | kubelet | ensemble-graph-38d71-7dbc85bf6b-hh2kv | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "ensemble-graph-38d71-serving-cert" not found

kserve-ci-e2e-test | kubelet | ensemble-graph-38d71-7dbc85bf6b-hh2kv | Created | Created container: ensemble-graph-38d71
kserve-ci-e2e-test | kubelet | ensemble-graph-38d71-7dbc85bf6b-hh2kv | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine
kserve-ci-e2e-test | multus | ensemble-graph-38d71-7dbc85bf6b-hh2kv | AddedInterface | Add eth0 [10.134.0.31/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | ensemble-graph-38d71-7dbc85bf6b-hh2kv | Started | Started container ensemble-graph-38d71 (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-63de8-85847f4f4d-lglk2 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-fe410-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-fe410-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-fe410-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-fe410-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-38d71-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-38d71-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-38d71-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-38d71-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Unhealthy | Readiness probe failed: dial tcp 10.132.0.25:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Unhealthy | Readiness probe failed: dial tcp 10.133.0.39:8080: connect: connection refused
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-fe410 | InferenceServiceReady | InferenceService [success-200-isvc-fe410] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-fe410 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-fe410 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-fe410 | InferenceServiceReady | InferenceService [error-404-isvc-fe410] is Ready
kserve-ci-e2e-test | multus | sequence-graph-fe410-7759986577-6jbfq | AddedInterface | Add eth0 [10.134.0.32/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | sequence-graph-fe410-7759986577-6jbfq | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine
kserve-ci-e2e-test | kubelet | sequence-graph-fe410-7759986577-6jbfq | Created | Created container: sequence-graph-fe410
kserve-ci-e2e-test | replicaset-controller | sequence-graph-fe410-7759986577 | SuccessfulCreate | Created pod: sequence-graph-fe410-7759986577-6jbfq
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-fe410 | InferenceGraphReady | InferenceGraph [sequence-graph-fe410] is Ready

kserve-ci-e2e-test | kubelet | sequence-graph-fe410-7759986577-6jbfq | Started | Started container sequence-graph-fe410
kserve-ci-e2e-test | deployment-controller | sequence-graph-fe410 | ScalingReplicaSet | Scaled up replica set sequence-graph-fe410-7759986577 from 0 to 1 (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-38d71 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-38d71 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-fe410-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-fe410-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-fe410 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-fe410 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-fe410-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-fe410-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-fe410 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-fe410 | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-ab6b8 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-ab6b8": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-ab6b8-predictor-serving-cert" not found
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-ab6b8 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-ab6b8": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-ab6b8": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | success-200-isvc-ab6b8-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-ab6b8-predictor-6857df4dd from 0 to 1
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-ab6b8-predictor-6857df4dd | SuccessfulCreate | Created pod: success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k
kserve-ci-e2e-test | kubelet | success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-38d71-predictor-c6f86d4b4-w2stk | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-ab6b8 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-ab6b8": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-ab6b8 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-ab6b8": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-ab6b8": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | multus | success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k | AddedInterface | Add eth0 [10.132.0.26/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | ensemble-graph-38d71-7dbc85bf6b-hh2kv | Killing | Stopping container ensemble-graph-38d71
kserve-ci-e2e-test | deployment-controller | error-404-isvc-ab6b8-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-ab6b8-predictor-7fb5d66599 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-ab6b8-predictor-7fb5d66599 | SuccessfulCreate | Created pod: error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g

kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k | Created | Created container: kserve-container
kserve-ci-e2e-test | multus | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | AddedInterface | Add eth0 [10.133.0.40/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine (x7)
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Unhealthy | Readiness probe failed: dial tcp 10.133.0.38:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-38d71-predictor-5cd74bff75-4bl6s | Unhealthy | Readiness probe failed: Get "https://10.133.0.38:8643/healthz": dial tcp 10.133.0.38:8643: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | ensemble-graph-38d71-7dbc85bf6b-hh2kv | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-ab6b8-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-ab6b8-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-ab6b8-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-ab6b8-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k | Unhealthy | Readiness probe failed: dial tcp 10.132.0.26:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g | Unhealthy | Readiness probe failed: dial tcp 10.133.0.40:8080: connect: connection refused
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-60a5c | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-60a5c": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-60a5c": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-60a5c-predictor-74955c648c-g4rt8 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-60a5c-predictor-serving-cert" not found

kserve-ci-e2e-test | kubelet | success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-60a5c-predictor-serving-cert" not found
kserve-ci-e2e-test | deployment-controller | success-200-isvc-60a5c-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-60a5c-predictor-5fbfb7dd4 from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-60a5c | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-60a5c": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Killing | Stopping container kserve-container
kserve-ci-e2e-test | deployment-controller | error-404-isvc-60a5c-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-60a5c-predictor-74955c648c from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-60a5c | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-60a5c": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-60a5c": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-fe410-predictor-66c5568bb-9sm5z | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-60a5c | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-60a5c": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | sequence-graph-fe410-7759986577-6jbfq | Killing | Stopping container sequence-graph-fe410
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-60a5c-predictor-74955c648c | SuccessfulCreate | Created pod: error-404-isvc-60a5c-predictor-74955c648c-g4rt8
kserve-ci-e2e-test | kubelet | error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-60a5c-predictor-5fbfb7dd4 | SuccessfulCreate | Created pod: success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t
kserve-ci-e2e-test | multus | success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t | AddedInterface | Add eth0 [10.132.0.27/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t | Created | Created container: kube-rbac-proxy

kserve-ci-e2e-test

multus

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

AddedInterface

Add eth0 [10.133.0.41/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-ab6b8

InferenceServiceReady

InferenceService [success-200-isvc-ab6b8] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-ab6b8

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-ab6b8

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-ab6b8

InferenceServiceReady

InferenceService [error-404-isvc-ab6b8] is Ready

kserve-ci-e2e-test

kubelet

splitter-graph-ab6b8-54bf769bd5-jnmp5

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "splitter-graph-ab6b8-serving-cert" not found
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-ab6b8-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-ab6b8-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

replicaset-controller

splitter-graph-ab6b8-54bf769bd5

SuccessfulCreate

Created pod: splitter-graph-ab6b8-54bf769bd5-jnmp5

kserve-ci-e2e-test

deployment-controller

splitter-graph-ab6b8

ScalingReplicaSet

Scaled up replica set splitter-graph-ab6b8-54bf769bd5 from 0 to 1

kserve-ci-e2e-test

InferenceGraphController

splitter-graph-ab6b8

InferenceGraphReady

InferenceGraph [splitter-graph-ab6b8] is Ready
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-ab6b8-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-ab6b8-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

splitter-graph-ab6b8-54bf769bd5-jnmp5

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine

kserve-ci-e2e-test

kubelet

splitter-graph-ab6b8-54bf769bd5-jnmp5

Started

Started container splitter-graph-ab6b8

kserve-ci-e2e-test

multus

splitter-graph-ab6b8-54bf769bd5-jnmp5

AddedInterface

Add eth0 [10.134.0.33/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

splitter-graph-ab6b8-54bf769bd5-jnmp5

Created

Created container: splitter-graph-ab6b8
(x6)

kserve-ci-e2e-test

kubelet

sequence-graph-fe410-7759986577-6jbfq

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

kubelet

success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k

Killing

Stopping container kserve-container

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-57557

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-57557": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-57557": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k

Killing

Stopping container kube-rbac-proxy
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-60a5c-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-60a5c-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-60a5c-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

splitter-graph-ab6b8-54bf769bd5-jnmp5

Killing

Stopping container splitter-graph-ab6b8

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-57557-predictor-7b7cb58b6c

SuccessfulCreate

Created pod: error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

kserve-ci-e2e-test

deployment-controller

error-404-isvc-57557-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-57557-predictor-7b7cb58b6c from 0 to 1
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-60a5c-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-57557

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-57557": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-57557-predictor-7cdcf47b7b

SuccessfulCreate

Created pod: success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

kserve-ci-e2e-test

deployment-controller

success-200-isvc-57557-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-57557-predictor-7cdcf47b7b from 0 to 1

kserve-ci-e2e-test

kubelet

error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-57557

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-57557": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-57557": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-57557

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-57557": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

multus

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

AddedInterface

Add eth0 [10.132.0.28/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-57557-predictor-serving-cert" not found

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Started

Started container kserve-container

kserve-ci-e2e-test

multus

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

AddedInterface

Add eth0 [10.133.0.42/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k

Unhealthy

Readiness probe failed: Get "https://10.132.0.26:8643/healthz": dial tcp 10.132.0.26:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g

Unhealthy

Readiness probe failed: Get "https://10.133.0.40:8643/healthz": dial tcp 10.133.0.40:8643: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Unhealthy

Readiness probe failed: dial tcp 10.133.0.41:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Unhealthy

Readiness probe failed: dial tcp 10.132.0.27:8080: connect: connection refused

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-60a5c

InferenceServiceReady

InferenceService [error-404-isvc-60a5c] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-60a5c

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-60a5c

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-60a5c

InferenceServiceReady

InferenceService [success-200-isvc-60a5c] is Ready
(x6)

kserve-ci-e2e-test

kubelet

splitter-graph-ab6b8-54bf769bd5-jnmp5

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

deployment-controller

switch-graph-60a5c

ScalingReplicaSet

Scaled up replica set switch-graph-60a5c-64b647654d from 0 to 1

kserve-ci-e2e-test

kubelet

switch-graph-60a5c-64b647654d-2knr4

Started

Started container switch-graph-60a5c

kserve-ci-e2e-test

multus

switch-graph-60a5c-64b647654d-2knr4

AddedInterface

Add eth0 [10.134.0.34/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

switch-graph-60a5c-64b647654d-2knr4

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine

kserve-ci-e2e-test

kubelet

switch-graph-60a5c-64b647654d-2knr4

Created

Created container: switch-graph-60a5c

kserve-ci-e2e-test

replicaset-controller

switch-graph-60a5c-64b647654d

SuccessfulCreate

Created pod: switch-graph-60a5c-64b647654d-2knr4

kserve-ci-e2e-test

InferenceGraphController

switch-graph-60a5c

InferenceGraphReady

InferenceGraph [switch-graph-60a5c] is Ready

kserve-ci-e2e-test

InferenceGraphController

switch-graph-60a5c

UpdateFailed

Failed to update status for InferenceGraph "switch-graph-60a5c": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "switch-graph-60a5c": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-57557-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-57557-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

InferenceGraphController

switch-graph-60a5c

InternalError

fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "switch-graph-60a5c": the object has been modified; please apply your changes to the latest version and try again
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Unhealthy

Readiness probe failed: dial tcp 10.132.0.28:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Unhealthy

Readiness probe failed: dial tcp 10.133.0.42:8080: connect: connection refused
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-60a5c-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-60a5c-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-60a5c-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-60a5c-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x6)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-57557

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-57557

InferenceServiceReady

InferenceService [success-200-isvc-57557] is Ready

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-57557

InferenceServiceReady

InferenceService [error-404-isvc-57557] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-57557

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-60a5c

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

InferenceGraphController

splitter-graph-57557

InferenceGraphReady

InferenceGraph [splitter-graph-57557] is Ready
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-57557-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

deployment-controller

splitter-graph-57557

ScalingReplicaSet

Scaled up replica set splitter-graph-57557-86766cd968 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

splitter-graph-57557-86766cd968

SuccessfulCreate

Created pod: splitter-graph-57557-86766cd968-2dzqj
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-57557-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-60a5c

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

splitter-graph-57557-86766cd968-2dzqj

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:0169f8fa51f9c0b83c5c9a121f51af9f3e3cdcc48ff1b00c2fcaa12b38a7914f" already present on machine

kserve-ci-e2e-test

kubelet

splitter-graph-57557-86766cd968-2dzqj

Started

Started container splitter-graph-57557

kserve-ci-e2e-test

kubelet

splitter-graph-57557-86766cd968-2dzqj

Created

Created container: splitter-graph-57557

kserve-ci-e2e-test

kubelet

splitter-graph-57557-86766cd968-2dzqj

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "splitter-graph-57557-serving-cert" not found

kserve-ci-e2e-test

multus

splitter-graph-57557-86766cd968-2dzqj

AddedInterface

Add eth0 [10.134.0.35/23] from ovn-kubernetes

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-57557

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-57557-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-57557-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-57557

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

splitter-graph-57557-86766cd968-2dzqj

Killing

Stopping container splitter-graph-57557

kserve-ci-e2e-test

kubelet

error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Unhealthy

Readiness probe failed: Get "https://10.132.0.28:8643/healthz": dial tcp 10.132.0.28:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-57557-predictor-7cdcf47b7b-75nzd

Unhealthy

Readiness probe failed: dial tcp 10.132.0.28:8080: i/o timeout
(x6)

kserve-ci-e2e-test

kubelet

splitter-graph-57557-86766cd968-2dzqj

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

kubelet

switch-graph-60a5c-64b647654d-2knr4

Killing

Stopping container switch-graph-60a5c

kserve-ci-e2e-test

kubelet

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-60a5c-predictor-74955c648c-g4rt8

Killing

Stopping container kserve-container
(x5)

kserve-ci-e2e-test

kubelet

switch-graph-60a5c-64b647654d-2knr4

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503