Time | Namespace | Component | RelatedObject | Reason | Message
- | kserve-ci-e2e-test | - | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-b9878-predictor-65869744dd-gsdxb | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-b9878-predictor-65869744dd-gsdxb to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9 | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9 to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | ensemble-graph-2525b-8997d599-kctgl | Scheduled | Successfully assigned kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-f4deb-predictor-c578cbf66-dkmz6 | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-f4deb-predictor-c578cbf66-dkmz6 to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh to ip-10-0-133-209.ec2.internal
- | kserve-ci-e2e-test | - | model-chainer-64c4bcbb69-xgd8t | Scheduled | Successfully assigned kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-2525b-predictor-76f74d576d-kh277 | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-2525b-predictor-76f74d576d-kh277 to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | sequence-graph-1ec3f-5f6c4cf864-t84z5 | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5 to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | sequence-graph-7fc99-59cc895d65-jhvth | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | sequence-graph-b9878-79b8776d89-hngl7 | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7 to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | splitter-graph-600f1-6878b7dd5c-cttln | Scheduled | Successfully assigned kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | splitter-graph-a4170-65f94cfc4d-7lwc7 | Scheduled | Successfully assigned kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7 to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-1dabb-predictor-68d6866787-m6hgq to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | switch-graph-914c2-55b54cc8f8-kv6sm | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | switch-graph-c12fa-594b845bff-sdhrf | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | switch-graph-f4deb-6fd5579b57-5tldk | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-1ec3f-predictor-677b5997f5-brmms | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-1ec3f-predictor-677b5997f5-brmms to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-2525b-predictor-67d9995cb7-lcmhq | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-2525b-predictor-67d9995cb7-lcmhq to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-a4170-predictor-59dc6578db-4k6pg | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-a4170-predictor-59dc6578db-4k6pg to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6 | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6 to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | ensemble-graph-1dabb-976bfb698-7kt9p | Scheduled | Successfully assigned kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-914c2-predictor-7db85f97bf-5zqtz to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6 to ip-10-0-133-48.ec2.internal
- | kserve-ci-e2e-test | - | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8 to ip-10-0-133-209.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-1dabb-predictor-64d84476b8-kbgzx to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-a4170-predictor-c76b89db5-hv7sv to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-b9878-predictor-fb4f998b7-crfr7 | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-b9878-predictor-fb4f998b7-crfr7 to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | error-404-isvc-914c2-predictor-78956bf458-s7292 | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-914c2-predictor-78956bf458-s7292 to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | - | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb to ip-10-0-136-201.ec2.internal
- | kserve-ci-e2e-test | replicaset-controller | isvc-sklearn-graph-1-predictor-79868c757f | SuccessfulCreate | Created pod: isvc-sklearn-graph-1-predictor-79868c757f-zc9fh
- | kserve-ci-e2e-test | deployment-controller | isvc-sklearn-graph-1-predictor | ScalingReplicaSet | Scaled up replica set isvc-sklearn-graph-1-predictor-79868c757f from 0 to 1
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-914c2-predictor-serving-cert" not found
- | kserve-ci-e2e-test | replicaset-controller | isvc-xgboost-graph-predictor-669d8d6456 | SuccessfulCreate | Created pod: isvc-xgboost-graph-predictor-669d8d6456-6z8g6
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-914c2-predictor-serving-cert" not found
- | kserve-ci-e2e-test | multus | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | AddedInterface | Add eth0 [10.133.0.17/23] from ovn-kubernetes
- | kserve-ci-e2e-test | replicaset-controller | error-404-isvc-914c2-predictor-78956bf458 | SuccessfulCreate | Created pod: error-404-isvc-914c2-predictor-78956bf458-s7292
- | kserve-ci-e2e-test | deployment-controller | isvc-sklearn-graph-2-predictor | ScalingReplicaSet | Scaled up replica set isvc-sklearn-graph-2-predictor-7758df598f from 0 to 1
- | kserve-ci-e2e-test | deployment-controller | success-200-isvc-914c2-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-914c2-predictor-7db85f97bf from 0 to 1
- | kserve-ci-e2e-test | replicaset-controller | success-200-isvc-914c2-predictor-7db85f97bf | SuccessfulCreate | Created pod: success-200-isvc-914c2-predictor-7db85f97bf-5zqtz
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Pulling | Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:b2a48068363ac4e806775467ef2dc5b6c858faecf49ae1f8bafc4262d036f97e"
- | kserve-ci-e2e-test | deployment-controller | isvc-xgboost-graph-predictor | ScalingReplicaSet | Scaled up replica set isvc-xgboost-graph-predictor-669d8d6456 from 0 to 1
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Pulling | Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:b2a48068363ac4e806775467ef2dc5b6c858faecf49ae1f8bafc4262d036f97e"
- | kserve-ci-e2e-test | deployment-controller | error-404-isvc-914c2-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-914c2-predictor-78956bf458 from 0 to 1
- | kserve-ci-e2e-test | multus | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | AddedInterface | Add eth0 [10.132.0.22/23] from ovn-kubernetes
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Pulling | Pulling image "quay.io/opendatahub/success-200-isvc:odh-pr-1449"
- | kserve-ci-e2e-test | multus | error-404-isvc-914c2-predictor-78956bf458-s7292 | AddedInterface | Add eth0 [10.134.0.37/23] from ovn-kubernetes
- | kserve-ci-e2e-test | multus | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | AddedInterface | Add eth0 [10.133.0.18/23] from ovn-kubernetes
- | kserve-ci-e2e-test | replicaset-controller | isvc-sklearn-graph-2-predictor-7758df598f | SuccessfulCreate | Created pod: isvc-sklearn-graph-2-predictor-7758df598f-nldq8
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Pulling | Pulling image "quay.io/opendatahub/error-404-isvc:odh-pr-1449"
- | kserve-ci-e2e-test | multus | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | AddedInterface | Add eth0 [10.134.0.36/23] from ovn-kubernetes
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Pulled | Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:b2a48068363ac4e806775467ef2dc5b6c858faecf49ae1f8bafc4262d036f97e" in 322ms (322ms including waiting). Image size: 299849138 bytes.
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Started | Started container storage-initializer
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Created | Created container: storage-initializer
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Pulled | Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:b2a48068363ac4e806775467ef2dc5b6c858faecf49ae1f8bafc4262d036f97e" in 3.207s (3.207s including waiting). Image size: 299849138 bytes.
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Created | Created container: storage-initializer
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Started | Started container storage-initializer
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Pulling | Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:b2a48068363ac4e806775467ef2dc5b6c858faecf49ae1f8bafc4262d036f97e"
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Started | Started container storage-initializer
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Pulled | Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:b2a48068363ac4e806775467ef2dc5b6c858faecf49ae1f8bafc4262d036f97e" in 3.173s (3.173s including waiting). Image size: 299849138 bytes.
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Created | Created container: storage-initializer
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Pulling | Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1449"
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Pulling | Pulling image "kserve/xgbserver:latest"
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Pulling | Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1449"
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Pulled | Successfully pulled image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" in 15.847s (15.847s including waiting). Image size: 1335702126 bytes.
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Started | Started container kserve-container
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Created | Created container: kserve-container
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Started | Started container kserve-container
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Created | Created container: kserve-container
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Pulled | Successfully pulled image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" in 15.94s (15.94s including waiting). Image size: 1334716521 bytes.
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.636s (2.636s including waiting). Image size: 211946088 bytes.
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.643s (2.643s including waiting). Image size: 211946088 bytes.
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Started | Started container kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Created | Created container: kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Started | Started container kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Created | Created container: kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Started | Started container kserve-container
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Pulled | Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1449" in 15.351s (15.351s including waiting). Image size: 1560926130 bytes.
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Created | Created container: kserve-container
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Created | Created container: kserve-container
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Pulled | Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1449" in 14.352s (14.352s including waiting). Image size: 1560926130 bytes.
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Started | Started container kserve-container
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Started | Started container kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.442s (2.442s including waiting). Image size: 211946088 bytes.
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-79868c757f-zc9fh | Created | Created container: kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.432s (2.432s including waiting). Image size: 211946088 bytes.
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Started | Started container kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Pulled | Successfully pulled image "kserve/xgbserver:latest" in 17.886s (17.886s including waiting). Image size: 1306417402 bytes.
- | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-7758df598f-nldq8 | Created | Created container: kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Started | Started container kserve-container
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Created | Created container: kserve-container
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Created | Created container: kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Started | Started container kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.142s (2.142s including waiting). Image size: 211946088 bytes.
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-914c2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-914c2-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-914c2-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-914c2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6) | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Unhealthy | Readiness probe failed: dial tcp 10.134.0.37:8080: connect: connection refused
(x6) | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Unhealthy | Readiness probe failed: dial tcp 10.134.0.36:8080: connect: connection refused
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
- | kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-914c2 | InferenceServiceReady | InferenceService [error-404-isvc-914c2] is Ready
- | kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-914c2 | InferenceServiceReady | InferenceService [success-200-isvc-914c2] is Ready
(x5) | kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-914c2 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5) | kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-914c2 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-914c2-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-914c2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
- | kserve-ci-e2e-test | replicaset-controller | switch-graph-914c2-55b54cc8f8 | SuccessfulCreate | Created pod: switch-graph-914c2-55b54cc8f8-kv6sm
- | kserve-ci-e2e-test | InferenceGraphController | switch-graph-914c2 | InferenceGraphReady | InferenceGraph [switch-graph-914c2] is Ready
- | kserve-ci-e2e-test | deployment-controller | switch-graph-914c2 | ScalingReplicaSet | Scaled up replica set switch-graph-914c2-55b54cc8f8 from 0 to 1
- | kserve-ci-e2e-test | InferenceGraphController | switch-graph-914c2 | InternalError | fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "switch-graph-914c2": the object has been modified; please apply your changes to the latest version and try again
- | kserve-ci-e2e-test | InferenceGraphController | switch-graph-914c2 | UpdateFailed | Failed to update status for InferenceGraph "switch-graph-914c2": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "switch-graph-914c2": the object has been modified; please apply your changes to the latest version and try again
- | kserve-ci-e2e-test | multus | switch-graph-914c2-55b54cc8f8-kv6sm | AddedInterface | Add eth0 [10.132.0.23/23] from ovn-kubernetes
- | kserve-ci-e2e-test | kubelet | switch-graph-914c2-55b54cc8f8-kv6sm | Pulling | Pulling image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad"
- | kserve-ci-e2e-test | kubelet | switch-graph-914c2-55b54cc8f8-kv6sm | Pulled | Successfully pulled image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" in 1.812s (1.812s including waiting). Image size: 216346205 bytes.
- | kserve-ci-e2e-test | kubelet | switch-graph-914c2-55b54cc8f8-kv6sm | Started | Started container switch-graph-914c2
- | kserve-ci-e2e-test | kubelet | switch-graph-914c2-55b54cc8f8-kv6sm | Created | Created container: switch-graph-914c2
(x8) | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-6z8g6 | Unhealthy | Readiness probe failed: dial tcp 10.132.0.22:8080: connect: connection refused
- | kserve-ci-e2e-test | deployment-controller | error-404-isvc-c12fa-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-c12fa-predictor-6c7dd5899d from 0 to 1
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-914c2-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
- | kserve-ci-e2e-test | replicaset-controller | success-200-isvc-c12fa-predictor-5fb79c554f | SuccessfulCreate | Created pod: success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb
- | kserve-ci-e2e-test | deployment-controller | success-200-isvc-c12fa-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-c12fa-predictor-5fb79c554f from 0 to 1
- | kserve-ci-e2e-test | replicaset-controller | error-404-isvc-c12fa-predictor-6c7dd5899d | SuccessfulCreate | Created pod: error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h
- | kserve-ci-e2e-test | kubelet | switch-graph-914c2-55b54cc8f8-kv6sm | Killing | Stopping container switch-graph-914c2
- | kserve-ci-e2e-test | kubelet | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-c12fa-predictor-serving-cert" not found
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Killing | Stopping container kube-rbac-proxy
(x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-914c2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Killing | Stopping container kserve-container
- | kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-c12fa | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-c12fa": the object has been modified; please apply your changes to the latest version and try again
- | kserve-ci-e2e-test | kubelet | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-c12fa-predictor-serving-cert" not found
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Killing | Stopping container kube-rbac-proxy
- | kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-c12fa | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-c12fa": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-c12fa": the object has been modified; please apply your changes to the latest version and try again
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Killing | Stopping container kserve-container
- | kserve-ci-e2e-test | kubelet | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine
- | kserve-ci-e2e-test | kubelet | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
- | kserve-ci-e2e-test | kubelet | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | Created | Created container: kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | Started | Started container kserve-container
- | kserve-ci-e2e-test | kubelet | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | Created | Created container: kserve-container
- | kserve-ci-e2e-test | multus | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | AddedInterface | Add eth0 [10.134.0.38/23] from ovn-kubernetes
- | kserve-ci-e2e-test | multus | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | AddedInterface | Add eth0 [10.134.0.39/23] from ovn-kubernetes
- | kserve-ci-e2e-test | kubelet | success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb | Started | Started container kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine
- | kserve-ci-e2e-test | kubelet | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | Created | Created container: kserve-container
- | kserve-ci-e2e-test | kubelet | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | Started | Started container kserve-container
- | kserve-ci-e2e-test | kubelet | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
- | kserve-ci-e2e-test | kubelet | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | Created | Created container: kube-rbac-proxy
- | kserve-ci-e2e-test | kubelet | error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h | Started | Started container kube-rbac-proxy
(x9) | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-2 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
- | kserve-ci-e2e-test | kubelet | error-404-isvc-914c2-predictor-78956bf458-s7292 | Unhealthy | Readiness probe failed: Get "https://10.134.0.37:8643/healthz": dial tcp 10.134.0.37:8643: connect: connection refused
(x9) | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-1 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
- | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-1 | InferenceServiceReady | InferenceService [isvc-sklearn-graph-1] is Ready
- | kserve-ci-e2e-test | kubelet | success-200-isvc-914c2-predictor-7db85f97bf-5zqtz | Unhealthy | Readiness probe failed: Get "https://10.134.0.36:8643/healthz": dial tcp 10.134.0.36:8643: connect: connection refused
- | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-2 | InferenceServiceReady | InferenceService [isvc-sklearn-graph-2] is Ready
- | kserve-ci-e2e-test | v1beta1Controllers | isvc-xgboost-graph | InferenceServiceReady | InferenceService [isvc-xgboost-graph] is Ready
(x9) | kserve-ci-e2e-test | v1beta1Controllers | isvc-xgboost-graph | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x6) | kserve-ci-e2e-test | kubelet | switch-graph-914c2-55b54cc8f8-kv6sm | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
- | kserve-ci-e2e-test | InferenceGraphController | model-chainer | InferenceGraphReady | InferenceGraph [model-chainer] is Ready
- | kserve-ci-e2e-test | deployment-controller | model-chainer | ScalingReplicaSet | Scaled up replica set model-chainer-64c4bcbb69 from 0 to 1
- | kserve-ci-e2e-test | replicaset-controller | model-chainer-64c4bcbb69 | SuccessfulCreate | Created pod: model-chainer-64c4bcbb69-xgd8t
- | kserve-ci-e2e-test | kubelet | model-chainer-64c4bcbb69-xgd8t | Started | Started container model-chainer
- | kserve-ci-e2e-test | kubelet | model-chainer-64c4bcbb69-xgd8t | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine
- | kserve-ci-e2e-test | kubelet | model-chainer-64c4bcbb69-xgd8t | Created | Created container: model-chainer
- | kserve-ci-e2e-test | multus | model-chainer-64c4bcbb69-xgd8t | AddedInterface | Add eth0 [10.132.0.24/23] from ovn-kubernetes
- | kserve-ci-e2e-test | kubelet | model-chainer-64c4bcbb69-xgd8t | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "model-chainer-serving-cert" not found
(x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-c12fa-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-1-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-c12fa-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

model-chainer-64c4bcbb69-xgd8t

Killing

Stopping container model-chainer

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-6z8g6

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-79868c757f-zc9fh

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-79868c757f-zc9fh

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-7fc99-predictor-serving-cert" not found

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-7fc99-predictor-5fbf44ffcf

SuccessfulCreate

Created pod: success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

kserve-ci-e2e-test

deployment-controller

success-200-isvc-7fc99-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-7fc99-predictor-5fbf44ffcf from 0 to 1

kserve-ci-e2e-test

deployment-controller

error-404-isvc-7fc99-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-7fc99-predictor-5fcd68d6fb from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-7fc99-predictor-5fcd68d6fb

SuccessfulCreate

Created pod: error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-7758df598f-nldq8

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

multus

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

AddedInterface

Add eth0 [10.134.0.40/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-7758df598f-nldq8

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-6z8g6

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-6z8g6

Unhealthy

Readiness probe failed: Get "https://10.132.0.22:8643/healthz": dial tcp 10.132.0.22:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Started

Started container kserve-container

kserve-ci-e2e-test

multus

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

AddedInterface

Add eth0 [10.134.0.41/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Created

Created container: kserve-container
(x9)

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-79868c757f-zc9fh

Unhealthy

Readiness probe failed: dial tcp 10.133.0.17:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-79868c757f-zc9fh

Unhealthy

Readiness probe failed: Get "https://10.133.0.17:8643/healthz": dial tcp 10.133.0.17:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-7758df598f-nldq8

Unhealthy

Readiness probe failed: Get "https://10.133.0.18:8643/healthz": dial tcp 10.133.0.18:8643: connect: connection refused
(x9)

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-7758df598f-nldq8

Unhealthy

Readiness probe failed: dial tcp 10.133.0.18:8080: connect: connection refused
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-c12fa-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-c12fa-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-c12fa

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-c12fa

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-c12fa

InferenceServiceReady

InferenceService [error-404-isvc-c12fa] is Ready

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-c12fa

InferenceServiceReady

InferenceService [success-200-isvc-c12fa] is Ready

kserve-ci-e2e-test

deployment-controller

switch-graph-c12fa

ScalingReplicaSet

Scaled up replica set switch-graph-c12fa-594b845bff from 0 to 1

kserve-ci-e2e-test

replicaset-controller

switch-graph-c12fa-594b845bff

SuccessfulCreate

Created pod: switch-graph-c12fa-594b845bff-sdhrf

kserve-ci-e2e-test

kubelet

switch-graph-c12fa-594b845bff-sdhrf

Started

Started container switch-graph-c12fa

kserve-ci-e2e-test

kubelet

switch-graph-c12fa-594b845bff-sdhrf

Created

Created container: switch-graph-c12fa

kserve-ci-e2e-test

kubelet

switch-graph-c12fa-594b845bff-sdhrf

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine

kserve-ci-e2e-test

multus

switch-graph-c12fa-594b845bff-sdhrf

AddedInterface

Add eth0 [10.132.0.25/23] from ovn-kubernetes

kserve-ci-e2e-test

InferenceGraphController

switch-graph-c12fa

InferenceGraphReady

InferenceGraph [switch-graph-c12fa] is Ready
(x6)

kserve-ci-e2e-test

kubelet

model-chainer-64c4bcbb69-xgd8t

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-7fc99-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-7fc99-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-c12fa-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-c12fa-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-c12fa-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-c12fa-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-7fc99

InferenceServiceReady

InferenceService [success-200-isvc-7fc99] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-7fc99

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-7fc99

InferenceServiceReady

InferenceService [error-404-isvc-7fc99] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-7fc99

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-c12fa

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-c12fa

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-7fc99-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

deployment-controller

sequence-graph-7fc99

ScalingReplicaSet

Scaled up replica set sequence-graph-7fc99-59cc895d65 from 0 to 1

kserve-ci-e2e-test

InferenceGraphController

sequence-graph-7fc99

InferenceGraphReady

InferenceGraph [sequence-graph-7fc99] is Ready
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-7fc99-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

replicaset-controller

sequence-graph-7fc99-59cc895d65

SuccessfulCreate

Created pod: sequence-graph-7fc99-59cc895d65-jhvth

kserve-ci-e2e-test

kubelet

sequence-graph-7fc99-59cc895d65-jhvth

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "sequence-graph-7fc99-serving-cert" not found

kserve-ci-e2e-test

multus

sequence-graph-7fc99-59cc895d65-jhvth

AddedInterface

Add eth0 [10.132.0.26/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

sequence-graph-7fc99-59cc895d65-jhvth

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine

kserve-ci-e2e-test

kubelet

sequence-graph-7fc99-59cc895d65-jhvth

Started

Started container sequence-graph-7fc99

kserve-ci-e2e-test

kubelet

sequence-graph-7fc99-59cc895d65-jhvth

Created

Created container: sequence-graph-7fc99
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-7fc99-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-7fc99-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

sequence-graph-7fc99

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

sequence-graph-7fc99

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

switch-graph-c12fa-594b845bff-sdhrf

Killing

Stopping container switch-graph-c12fa

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-2525b

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-2525b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-2525b-predictor-serving-cert" not found

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-2525b-predictor-67d9995cb7

SuccessfulCreate

Created pod: success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

kserve-ci-e2e-test

kubelet

success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

deployment-controller

success-200-isvc-2525b-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-2525b-predictor-67d9995cb7 from 0 to 1

kserve-ci-e2e-test

kubelet

error-404-isvc-2525b-predictor-76f74d576d-kh277

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-2525b-predictor-serving-cert" not found

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-2525b-predictor-76f74d576d

SuccessfulCreate

Created pod: error-404-isvc-2525b-predictor-76f74d576d-kh277

kserve-ci-e2e-test

deployment-controller

error-404-isvc-2525b-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-2525b-predictor-76f74d576d from 0 to 1

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-2525b

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-2525b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-2525b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-2525b

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-2525b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-2525b

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-2525b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-2525b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h

Killing

Stopping container kserve-container

kserve-ci-e2e-test

multus

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

AddedInterface

Add eth0 [10.134.0.42/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-2525b-predictor-76f74d576d-kh277

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

Created

Created container: kserve-container

kserve-ci-e2e-test

multus

error-404-isvc-2525b-predictor-76f74d576d-kh277

AddedInterface

Add eth0 [10.134.0.43/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-2525b-predictor-76f74d576d-kh277

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-2525b-predictor-76f74d576d-kh277

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-2525b-predictor-76f74d576d-kh277

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-2525b-predictor-76f74d576d-kh277

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-2525b-predictor-76f74d576d-kh277

Started

Started container kube-rbac-proxy
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h

Unhealthy

Readiness probe failed: dial tcp 10.134.0.39:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb

Unhealthy

Readiness probe failed: Get "https://10.134.0.38:8643/healthz": dial tcp 10.134.0.38:8643: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-c12fa-predictor-5fb79c554f-vpzgb

Unhealthy

Readiness probe failed: dial tcp 10.134.0.38:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-c12fa-predictor-6c7dd5899d-64n9h

Unhealthy

Readiness probe failed: Get "https://10.134.0.39:8643/healthz": dial tcp 10.134.0.39:8643: connect: connection refused

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-2525b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-2525b-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6)

kserve-ci-e2e-test

kubelet

switch-graph-c12fa-594b845bff-sdhrf

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-2525b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-2525b-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-2525b-predictor-67d9995cb7-lcmhq

Unhealthy

Readiness probe failed: dial tcp 10.134.0.42:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-2525b-predictor-76f74d576d-kh277

Unhealthy

Readiness probe failed: dial tcp 10.134.0.43:8080: connect: connection refused

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-b9878

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-b9878": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

success-200-isvc-b9878-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-b9878-predictor-fb4f998b7 from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-b9878

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-b9878": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-b9878": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

sequence-graph-7fc99-59cc895d65-jhvth

Killing

Stopping container sequence-graph-7fc99

kserve-ci-e2e-test

multus

error-404-isvc-b9878-predictor-65869744dd-gsdxb

AddedInterface

Add eth0 [10.134.0.45/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-b9878-predictor-fb4f998b7-crfr7

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-b9878-predictor-serving-cert" not found

kserve-ci-e2e-test

deployment-controller

error-404-isvc-b9878-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-b9878-predictor-65869744dd from 0 to 1

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-b9878-predictor-65869744dd-gsdxb

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-b9878-predictor-65869744dd

SuccessfulCreate

Created pod: error-404-isvc-b9878-predictor-65869744dd-gsdxb

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-b9878

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-b9878": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-b9878": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-b9878-predictor-fb4f998b7

SuccessfulCreate

Created pod: success-200-isvc-b9878-predictor-fb4f998b7-crfr7

kserve-ci-e2e-test

kubelet

error-404-isvc-b9878-predictor-65869744dd-gsdxb

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-b9878

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-b9878": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-b9878-predictor-65869744dd-gsdxb

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-b9878-predictor-65869744dd-gsdxb

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-b9878-predictor-65869744dd-gsdxb

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-b9878-predictor-65869744dd-gsdxb

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-b9878-predictor-fb4f998b7-crfr7

Started

Started container kube-rbac-proxy
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Unhealthy

Readiness probe failed: dial tcp 10.134.0.40:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-b9878-predictor-fb4f998b7-crfr7

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

multus

success-200-isvc-b9878-predictor-fb4f998b7-crfr7

AddedInterface

Add eth0 [10.134.0.44/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-b9878-predictor-fb4f998b7-crfr7

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-b9878-predictor-fb4f998b7-crfr7

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-b9878-predictor-fb4f998b7-crfr7

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-7fc99-predictor-5fbf44ffcf-rllc6

Unhealthy

Readiness probe failed: Get "https://10.134.0.40:8643/healthz": dial tcp 10.134.0.40:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-b9878-predictor-fb4f998b7-crfr7

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Unhealthy

Readiness probe failed: Get "https://10.134.0.41:8643/healthz": dial tcp 10.134.0.41:8643: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-7fc99-predictor-5fcd68d6fb-tw55n

Unhealthy

Readiness probe failed: dial tcp 10.134.0.41:8080: connect: connection refused
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-2525b

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-2525b

InferenceServiceReady

InferenceService [error-404-isvc-2525b] is Ready

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-2525b

InferenceServiceReady

InferenceService [success-200-isvc-2525b] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-2525b

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

InferenceGraphController

ensemble-graph-2525b

UpdateFailed

Failed to update status for InferenceGraph "ensemble-graph-2525b": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "ensemble-graph-2525b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test | multus | ensemble-graph-2525b-8997d599-kctgl | AddedInterface | Add eth0 [10.132.0.27/23] from ovn-kubernetes (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-2525b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-2525b | InferenceGraphReady | InferenceGraph [ensemble-graph-2525b] is Ready
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-2525b | InternalError | fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "ensemble-graph-2525b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-2525b-8997d599 | SuccessfulCreate | Created pod: ensemble-graph-2525b-8997d599-kctgl
kserve-ci-e2e-test | deployment-controller | ensemble-graph-2525b | ScalingReplicaSet | Scaled up replica set ensemble-graph-2525b-8997d599 from 0 to 1 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-2525b-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | ensemble-graph-2525b-8997d599-kctgl | Started | Started container ensemble-graph-2525b
kserve-ci-e2e-test | kubelet | ensemble-graph-2525b-8997d599-kctgl | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-2525b-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | ensemble-graph-2525b-8997d599-kctgl | Created | Created container: ensemble-graph-2525b (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-2525b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-7fc99-59cc895d65-jhvth | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-1dabb | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-1dabb": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-1dabb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-1dabb-predictor-serving-cert" not found
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-1dabb | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-1dabb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-1dabb-predictor-64d84476b8 | SuccessfulCreate | Created pod: error-404-isvc-1dabb-predictor-64d84476b8-kbgzx
kserve-ci-e2e-test | kubelet | success-200-isvc-2525b-predictor-67d9995cb7-lcmhq | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-2525b-predictor-76f74d576d-kh277 | Killing | Stopping container kserve-container (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-b9878-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | deployment-controller | error-404-isvc-1dabb-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-1dabb-predictor-64d84476b8 from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-2525b-predictor-67d9995cb7-lcmhq | Killing | Stopping container kube-rbac-proxy (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-b9878-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-1dabb | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-1dabb": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-1dabb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | ensemble-graph-2525b-8997d599-kctgl | Killing | Stopping container ensemble-graph-2525b

kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-1dabb | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-1dabb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | success-200-isvc-1dabb-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-1dabb-predictor-68d6866787 from 0 to 1
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-b9878-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-1dabb-predictor-68d6866787 | SuccessfulCreate | Created pod: success-200-isvc-1dabb-predictor-68d6866787-m6hgq
kserve-ci-e2e-test | kubelet | error-404-isvc-2525b-predictor-76f74d576d-kh277 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-b9878-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | multus | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | AddedInterface | Add eth0 [10.134.0.46/23] from ovn-kubernetes
kserve-ci-e2e-test | multus | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | AddedInterface | Add eth0 [10.134.0.47/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-2525b-predictor-67d9995cb7-lcmhq | Unhealthy | Readiness probe failed: Get "https://10.134.0.42:8643/healthz": dial tcp 10.134.0.42:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-2525b-predictor-76f74d576d-kh277 | Unhealthy | Readiness probe failed: Get "https://10.134.0.43:8643/healthz": dial tcp 10.134.0.43:8643: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-b9878-predictor-65869744dd-gsdxb | Unhealthy | Readiness probe failed: dial tcp 10.134.0.45:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-b9878-predictor-fb4f998b7-crfr7 | Unhealthy | Readiness probe failed: dial tcp 10.134.0.44:8080: connect: connection refused
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-b9878 | InferenceServiceReady | InferenceService [error-404-isvc-b9878] is Ready (x5)

kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-b9878 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-b9878 | InferenceServiceReady | InferenceService [success-200-isvc-b9878] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-b9878 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x6)
kserve-ci-e2e-test | kubelet | ensemble-graph-2525b-8997d599-kctgl | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-b9878-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-1dabb-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-1dabb-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-b9878 | InternalError | fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "sequence-graph-b9878": the object has been modified; please apply your changes to the latest version and try again (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-1dabb-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-1dabb-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-b9878 | UpdateFailed | Failed to update status for InferenceGraph "sequence-graph-b9878": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "sequence-graph-b9878": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-b9878 | InferenceGraphReady | InferenceGraph [sequence-graph-b9878] is Ready
kserve-ci-e2e-test | deployment-controller | sequence-graph-b9878 | ScalingReplicaSet | Scaled up replica set sequence-graph-b9878-79b8776d89 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | sequence-graph-b9878-79b8776d89 | SuccessfulCreate | Created pod: sequence-graph-b9878-79b8776d89-hngl7
kserve-ci-e2e-test | kubelet | sequence-graph-b9878-79b8776d89-hngl7 | Started | Started container sequence-graph-b9878
kserve-ci-e2e-test | kubelet | sequence-graph-b9878-79b8776d89-hngl7 | Created | Created container: sequence-graph-b9878
kserve-ci-e2e-test | kubelet | sequence-graph-b9878-79b8776d89-hngl7 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine
kserve-ci-e2e-test | multus | sequence-graph-b9878-79b8776d89-hngl7 | AddedInterface | Add eth0 [10.132.0.28/23] from ovn-kubernetes (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-b9878-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-b9878-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-b9878-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-1ec3f | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-1ec3f": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-1ec3f": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-1ec3f-predictor-5c44fc5cc6 | SuccessfulCreate | Created pod: error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-1ec3f-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | sequence-graph-b9878-79b8776d89-hngl7 | Killing | Stopping container sequence-graph-b9878

kserve-ci-e2e-test | kubelet | error-404-isvc-b9878-predictor-65869744dd-gsdxb | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-b9878-predictor-fb4f998b7-crfr7 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-1ec3f | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-1ec3f": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-b9878-predictor-65869744dd-gsdxb | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-b9878-predictor-fb4f998b7-crfr7 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-1ec3f-predictor-677b5997f5-brmms | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | deployment-controller | success-200-isvc-1ec3f-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-1ec3f-predictor-677b5997f5 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-1ec3f-predictor-677b5997f5 | SuccessfulCreate | Created pod: success-200-isvc-1ec3f-predictor-677b5997f5-brmms
kserve-ci-e2e-test | kubelet | success-200-isvc-1ec3f-predictor-677b5997f5-brmms | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-1ec3f-predictor-677b5997f5-brmms | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | deployment-controller | error-404-isvc-1ec3f-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-1ec3f-predictor-5c44fc5cc6 from 0 to 1
kserve-ci-e2e-test | multus | success-200-isvc-1ec3f-predictor-677b5997f5-brmms | AddedInterface | Add eth0 [10.134.0.48/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-1ec3f-predictor-677b5997f5-brmms | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-1ec3f-predictor-677b5997f5-brmms | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-1ec3f-predictor-677b5997f5-brmms | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | multus | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | AddedInterface | Add eth0 [10.134.0.49/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-b9878-predictor-65869744dd-gsdxb | Unhealthy | Readiness probe failed: Get "https://10.134.0.45:8643/healthz": dial tcp 10.134.0.45:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | success-200-isvc-b9878-predictor-fb4f998b7-crfr7 | Unhealthy | Readiness probe failed: Get "https://10.134.0.44:8643/healthz": dial tcp 10.134.0.44:8643: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Unhealthy | Readiness probe failed: dial tcp 10.134.0.46:8080: connect: connection refused (x5)

kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-1dabb | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-1dabb | InferenceServiceReady | InferenceService [error-404-isvc-1dabb] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-1dabb | InferenceServiceReady | InferenceService [success-200-isvc-1dabb] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-1dabb | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-b9878-79b8776d89-hngl7 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-1dabb-976bfb698 | SuccessfulCreate | Created pod: ensemble-graph-1dabb-976bfb698-7kt9p
kserve-ci-e2e-test | deployment-controller | ensemble-graph-1dabb | ScalingReplicaSet | Scaled up replica set ensemble-graph-1dabb-976bfb698 from 0 to 1
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-1dabb | InternalError | fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "ensemble-graph-1dabb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | ensemble-graph-1dabb-976bfb698-7kt9p | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "ensemble-graph-1dabb-serving-cert" not found
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-1dabb | UpdateFailed | Failed to update status for InferenceGraph "ensemble-graph-1dabb": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "ensemble-graph-1dabb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-1dabb | InferenceGraphReady | InferenceGraph [ensemble-graph-1dabb] is Ready
kserve-ci-e2e-test | multus | ensemble-graph-1dabb-976bfb698-7kt9p | AddedInterface | Add eth0 [10.132.0.29/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | ensemble-graph-1dabb-976bfb698-7kt9p | Started | Started container ensemble-graph-1dabb
kserve-ci-e2e-test | kubelet | ensemble-graph-1dabb-976bfb698-7kt9p | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine
kserve-ci-e2e-test | kubelet | ensemble-graph-1dabb-976bfb698-7kt9p | Created | Created container: ensemble-graph-1dabb (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-1dabb-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-1dabb-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-1dabb-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-1dabb-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x7)
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Unhealthy | Readiness probe failed: dial tcp 10.134.0.49:8080: connect: connection refused (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-1ec3f-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-1dabb | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-1ec3f-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-1dabb | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-1ec3f-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)

kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-1ec3f-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-1ec3f | InferenceServiceReady | InferenceService [success-200-isvc-1ec3f] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-1ec3f | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-1ec3f | InferenceServiceReady | InferenceService [error-404-isvc-1ec3f] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-1ec3f | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-1dabb | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-1dabb | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | deployment-controller | sequence-graph-1ec3f | ScalingReplicaSet | Scaled up replica set sequence-graph-1ec3f-5f6c4cf864 from 0 to 1
kserve-ci-e2e-test | kubelet | sequence-graph-1ec3f-5f6c4cf864-t84z5 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "sequence-graph-1ec3f-serving-cert" not found
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-1ec3f | InferenceGraphReady | InferenceGraph [sequence-graph-1ec3f] is Ready
kserve-ci-e2e-test | replicaset-controller | sequence-graph-1ec3f-5f6c4cf864 | SuccessfulCreate | Created pod: sequence-graph-1ec3f-5f6c4cf864-t84z5
kserve-ci-e2e-test | kubelet | sequence-graph-1ec3f-5f6c4cf864-t84z5 | Started | Started container sequence-graph-1ec3f
kserve-ci-e2e-test | kubelet | sequence-graph-1ec3f-5f6c4cf864-t84z5 | Created | Created container: sequence-graph-1ec3f
kserve-ci-e2e-test | kubelet | sequence-graph-1ec3f-5f6c4cf864-t84z5 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine
kserve-ci-e2e-test | multus | sequence-graph-1ec3f-5f6c4cf864-t84z5 | AddedInterface | Add eth0 [10.132.0.30/23] from ovn-kubernetes (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-1ec3f-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-1ec3f-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-1ec3f-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-1ec3f-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-1ec3f | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-1ec3f | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | deployment-controller | success-200-isvc-a4170-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-a4170-predictor-c76b89db5 from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-a4170 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-a4170": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-a4170 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-a4170": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-a4170 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-a4170": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-a4170": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test | deployment-controller | error-404-isvc-a4170-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-a4170-predictor-59dc6578db from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-a4170 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-a4170": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-a4170": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | multus | error-404-isvc-a4170-predictor-59dc6578db-4k6pg | AddedInterface | Add eth0 [10.134.0.51/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-a4170-predictor-59dc6578db-4k6pg | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-a4170-predictor-59dc6578db-4k6pg | Created | Created container: kserve-container
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-a4170-predictor-c76b89db5 | SuccessfulCreate | Created pod: success-200-isvc-a4170-predictor-c76b89db5-hv7sv
kserve-ci-e2e-test | kubelet | error-404-isvc-a4170-predictor-59dc6578db-4k6pg | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-a4170-predictor-59dc6578db-4k6pg | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-a4170-predictor-59dc6578db-4k6pg | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | ensemble-graph-1dabb-976bfb698-7kt9p | Killing | Stopping container ensemble-graph-1dabb
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-a4170-predictor-59dc6578db | SuccessfulCreate | Created pod: error-404-isvc-a4170-predictor-59dc6578db-4k6pg
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-1dabb-predictor-68d6866787-m6hgq | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-a4170-predictor-59dc6578db-4k6pg | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-a4170-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | AddedInterface | Add eth0 [10.134.0.50/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-a4170-predictor-c76b89db5-hv7sv | Started | Started container kube-rbac-proxy (x8)
kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Unhealthy | Readiness probe failed: dial tcp 10.134.0.47:8080: connect: connection refused

kserve-ci-e2e-test | kubelet | error-404-isvc-1dabb-predictor-64d84476b8-kbgzx | Unhealthy | Readiness probe failed: Get "https://10.134.0.47:8643/healthz": dial tcp 10.134.0.47:8643: connect: connection refused
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-a4170-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-a4170-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x6)
kserve-ci-e2e-test | kubelet | ensemble-graph-1dabb-976bfb698-7kt9p | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f4deb | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-f4deb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-f4deb-predictor-55df7bbd7c | SuccessfulCreate | Created pod: success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9
kserve-ci-e2e-test | kubelet | success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-f4deb-predictor-serving-cert" not found
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f4deb | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-f4deb": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-f4deb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f4deb | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-f4deb": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | sequence-graph-1ec3f-5f6c4cf864-t84z5 | Killing | Stopping container sequence-graph-1ec3f

kserve-ci-e2e-test

multus

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

AddedInterface

Add eth0 [10.134.0.53/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-1ec3f-predictor-677b5997f5-brmms

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-1ec3f-predictor-677b5997f5-brmms

Killing

Stopping container kube-rbac-proxy
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-a4170-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-a4170-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

error-404-isvc-1ec3f-predictor-5c44fc5cc6-svnjx

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-f4deb-predictor-c578cbf66

SuccessfulCreate

Created pod: error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

kserve-ci-e2e-test

deployment-controller

error-404-isvc-f4deb-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-f4deb-predictor-c578cbf66 from 0 to 1

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-f4deb

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-f4deb": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-f4deb": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

success-200-isvc-f4deb-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-f4deb-predictor-55df7bbd7c from 0 to 1

kserve-ci-e2e-test

multus

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

AddedInterface

Add eth0 [10.134.0.52/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Started

Started container kserve-container
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-1ec3f-predictor-677b5997f5-brmms

Unhealthy

Readiness probe failed: dial tcp 10.134.0.48:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-1ec3f-predictor-677b5997f5-brmms

Unhealthy

Readiness probe failed: Get "https://10.134.0.48:8643/healthz": dial tcp 10.134.0.48:8643: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-a4170-predictor-59dc6578db-4k6pg

Unhealthy

Readiness probe failed: dial tcp 10.134.0.51:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-a4170-predictor-c76b89db5-hv7sv

Unhealthy

Readiness probe failed: dial tcp 10.134.0.50:8080: connect: connection refused

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-f4deb-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-f4deb-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-a4170

InferenceServiceReady

InferenceService [error-404-isvc-a4170] is Ready

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-a4170

InferenceServiceReady

InferenceService [success-200-isvc-a4170] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-a4170

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-a4170

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x6)

kserve-ci-e2e-test

kubelet

sequence-graph-1ec3f-5f6c4cf864-t84z5

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

kubelet

splitter-graph-a4170-65f94cfc4d-7lwc7

Created

Created container: splitter-graph-a4170

kserve-ci-e2e-test

replicaset-controller

splitter-graph-a4170-65f94cfc4d

SuccessfulCreate

Created pod: splitter-graph-a4170-65f94cfc4d-7lwc7
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-a4170-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-a4170-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-a4170-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

multus

splitter-graph-a4170-65f94cfc4d-7lwc7

AddedInterface

Add eth0 [10.132.0.31/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

splitter-graph-a4170-65f94cfc4d-7lwc7

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-a4170-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

splitter-graph-a4170-65f94cfc4d-7lwc7

Started

Started container splitter-graph-a4170

kserve-ci-e2e-test

InferenceGraphController

splitter-graph-a4170

InferenceGraphReady

InferenceGraph [splitter-graph-a4170] is Ready

kserve-ci-e2e-test

deployment-controller

splitter-graph-a4170

ScalingReplicaSet

Scaled up replica set splitter-graph-a4170-65f94cfc4d from 0 to 1
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Unhealthy

Readiness probe failed: dial tcp 10.134.0.53:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Unhealthy

Readiness probe failed: dial tcp 10.134.0.52:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

splitter-graph-a4170-65f94cfc4d-7lwc7

Killing

Stopping container splitter-graph-a4170

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-600f1-predictor-serving-cert" not found

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-600f1

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-600f1": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-600f1": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

success-200-isvc-600f1-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-600f1-predictor-6d4bfcf845 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-600f1-predictor-7bcf7b8fd6

SuccessfulCreate

Created pod: error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

kserve-ci-e2e-test

deployment-controller

error-404-isvc-600f1-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-600f1-predictor-7bcf7b8fd6 from 0 to 1

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-600f1

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-600f1": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-a4170-predictor-59dc6578db-4k6pg

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-a4170-predictor-59dc6578db-4k6pg

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-a4170-predictor-c76b89db5-hv7sv

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-a4170-predictor-c76b89db5-hv7sv

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-600f1-predictor-6d4bfcf845

SuccessfulCreate

Created pod: success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

kserve-ci-e2e-test

multus

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

AddedInterface

Add eth0 [10.134.0.54/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1449" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Started

Started container kserve-container

kserve-ci-e2e-test

multus

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

AddedInterface

Add eth0 [10.134.0.55/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-a4170-predictor-59dc6578db-4k6pg

Unhealthy

Readiness probe failed: Get "https://10.134.0.51:8643/healthz": dial tcp 10.134.0.51:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-a4170-predictor-c76b89db5-hv7sv

Unhealthy

Readiness probe failed: Get "https://10.134.0.50:8643/healthz": dial tcp 10.134.0.50:8643: connect: connection refused
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-f4deb-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-f4deb-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-f4deb

InferenceServiceReady

InferenceService [error-404-isvc-f4deb] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-f4deb

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x6)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-f4deb

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-f4deb

InferenceServiceReady

InferenceService [success-200-isvc-f4deb] is Ready

kserve-ci-e2e-test

kubelet

switch-graph-f4deb-6fd5579b57-5tldk

Started

Started container switch-graph-f4deb

kserve-ci-e2e-test

replicaset-controller

switch-graph-f4deb-6fd5579b57

SuccessfulCreate

Created pod: switch-graph-f4deb-6fd5579b57-5tldk

kserve-ci-e2e-test

InferenceGraphController

switch-graph-f4deb

InferenceGraphReady

InferenceGraph [switch-graph-f4deb] is Ready

kserve-ci-e2e-test

multus

switch-graph-f4deb-6fd5579b57-5tldk

AddedInterface

Add eth0 [10.132.0.32/23] from ovn-kubernetes

kserve-ci-e2e-test

deployment-controller

switch-graph-f4deb

ScalingReplicaSet

Scaled up replica set switch-graph-f4deb-6fd5579b57 from 0 to 1

kserve-ci-e2e-test

kubelet

switch-graph-f4deb-6fd5579b57-5tldk

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine

kserve-ci-e2e-test

kubelet

switch-graph-f4deb-6fd5579b57-5tldk

Created

Created container: switch-graph-f4deb
(x6)

kserve-ci-e2e-test

kubelet

splitter-graph-a4170-65f94cfc4d-7lwc7

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-600f1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-600f1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-600f1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-600f1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-f4deb-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-f4deb-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-f4deb

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-f4deb

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-600f1

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-600f1

InferenceServiceReady

InferenceService [success-200-isvc-600f1] is Ready

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-600f1

InferenceServiceReady

InferenceService [error-404-isvc-600f1] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-600f1

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-f4deb-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-f4deb-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-f4deb

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-f4deb

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

replicaset-controller

splitter-graph-600f1-6878b7dd5c

SuccessfulCreate

Created pod: splitter-graph-600f1-6878b7dd5c-cttln

kserve-ci-e2e-test

deployment-controller

splitter-graph-600f1

ScalingReplicaSet

Scaled up replica set splitter-graph-600f1-6878b7dd5c from 0 to 1

kserve-ci-e2e-test

InferenceGraphController

splitter-graph-600f1

InferenceGraphReady

InferenceGraph [splitter-graph-600f1] is Ready

kserve-ci-e2e-test

kubelet

splitter-graph-600f1-6878b7dd5c-cttln

Started

Started container splitter-graph-600f1

kserve-ci-e2e-test

multus

splitter-graph-600f1-6878b7dd5c-cttln

AddedInterface

Add eth0 [10.132.0.33/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

splitter-graph-600f1-6878b7dd5c-cttln

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:b42fc184cd9997665c45700486ec1c9301c424ede94aa53d528583b0fe44a7ad" already present on machine

kserve-ci-e2e-test

kubelet

splitter-graph-600f1-6878b7dd5c-cttln

Created

Created container: splitter-graph-600f1
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-600f1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-600f1-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-600f1-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-600f1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-600f1

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-600f1

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-600f1

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-600f1

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

splitter-graph-600f1-6878b7dd5c-cttln

Killing

Stopping container splitter-graph-600f1

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Killing

Stopping container kube-rbac-proxy
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Unhealthy

Readiness probe failed: dial tcp 10.134.0.55:8080: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Unhealthy

Readiness probe failed: dial tcp 10.134.0.54:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-600f1-predictor-7bcf7b8fd6-f6xnj

Unhealthy

Readiness probe failed: Get "https://10.134.0.55:8643/healthz": dial tcp 10.134.0.55:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-600f1-predictor-6d4bfcf845-sdtdv

Unhealthy

Readiness probe failed: Get "https://10.134.0.54:8643/healthz": dial tcp 10.134.0.54:8643: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

splitter-graph-600f1-6878b7dd5c-cttln

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

kubelet

switch-graph-f4deb-6fd5579b57-5tldk

Killing

Stopping container switch-graph-f4deb

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-f4deb-predictor-c578cbf66-dkmz6

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-f4deb-predictor-55df7bbd7c-rd4c9

Killing

Stopping container kserve-container
(x5)

kserve-ci-e2e-test

kubelet

switch-graph-f4deb-6fd5579b57-5tldk

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503