| Count | Namespace | Component | RelatedObject | Reason | Message |
| --- | --- | --- | --- | --- | --- |
| | kserve-ci-e2e-test | | error-404-isvc-41c4d-predictor-ffc88656f-4gvqb | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-41c4d-predictor-ffc88656f-4gvqb to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-9e0ff-predictor-598855d998-zbbsz to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-f6018-predictor-6c99ffbf99-pllwl to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qgmdv to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f to ip-10-0-129-144.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-8b822-predictor-74448fdcdb-6sqxm | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-8b822-predictor-74448fdcdb-6sqxm to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-7afa6-predictor-7549c68964-lvtm5 | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-7afa6-predictor-7549c68964-lvtm5 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-6941b-predictor-79dfb7969c-rn424 | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-6941b-predictor-79dfb7969c-rn424 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | ensemble-graph-c68dd-c6797d99d-xsj96 | Scheduled | Successfully assigned kserve-ci-e2e-test/ensemble-graph-c68dd-c6797d99d-xsj96 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | switch-graph-9e0ff-7bdf4d4867-lckwb | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-9e0ff-7bdf4d4867-lckwb to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | switch-graph-6941b-869c44b684-24nwx | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-6941b-869c44b684-24nwx to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | splitter-graph-7afa6-7fb6d7495d-5g5l4 | Scheduled | Successfully assigned kserve-ci-e2e-test/splitter-graph-7afa6-7fb6d7495d-5g5l4 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-d0d22-predictor-65f777864b-fb2rr | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-d0d22-predictor-65f777864b-fb2rr to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | splitter-graph-41c4d-654b49775c-n9pd5 | Scheduled | Successfully assigned kserve-ci-e2e-test/splitter-graph-41c4d-654b49775c-n9pd5 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | sequence-graph-f6018-777885f489-bxxmg | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-f6018-777885f489-bxxmg to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | sequence-graph-d0d22-77bb574b5d-lkw4b | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-d0d22-77bb574b5d-lkw4b to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl to ip-10-0-138-170.ec2.internal |
| | kserve-ci-e2e-test | | sequence-graph-8b822-89658595d-trzvx | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-8b822-89658595d-trzvx to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5 | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-d0d22-predictor-76db56b8f-hz2ck | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-d0d22-predictor-76db56b8f-hz2ck to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | switch-graph-ce4d4-66ccfb86c4-sd5b6 | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-ce4d4-66ccfb86c4-sd5b6 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-f6018-predictor-55f969cd7b-vv7hw to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | ensemble-graph-ac602-7c945b5b88-wwhgf | Scheduled | Successfully assigned kserve-ci-e2e-test/ensemble-graph-ac602-7c945b5b88-wwhgf to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-ce4d4-predictor-5d646b757d-msdzr | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-ce4d4-predictor-5d646b757d-msdzr to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | success-200-isvc-ac602-predictor-757bd75446-krh9b | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-ac602-predictor-757bd75446-krh9b to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | model-chainer-86b99c7c88-lngx8 | Scheduled | Successfully assigned kserve-ci-e2e-test/model-chainer-86b99c7c88-lngx8 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | | error-404-isvc-7afa6-predictor-5f65ffc855-lhn82 | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-7afa6-predictor-5f65ffc855-lhn82 to ip-10-0-138-137.ec2.internal |
| | kserve-ci-e2e-test | deployment-controller | isvc-sklearn-graph-1-predictor | ScalingReplicaSet | Scaled up replica set isvc-sklearn-graph-1-predictor-5b497dcd98 from 0 to 1 |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-1 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-sklearn-graph-1": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | replicaset-controller | error-404-isvc-9e0ff-predictor-6655bb8c8f | SuccessfulCreate | Created pod: error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "isvc-xgboost-graph-predictor-serving-cert" not found |
| | kserve-ci-e2e-test | deployment-controller | success-200-isvc-9e0ff-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-9e0ff-predictor-598855d998 from 0 to 1 |
| | kserve-ci-e2e-test | replicaset-controller | success-200-isvc-9e0ff-predictor-598855d998 | SuccessfulCreate | Created pod: success-200-isvc-9e0ff-predictor-598855d998-zbbsz |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-1 | UpdateFailed | Failed to update status for InferenceService "isvc-sklearn-graph-1": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-sklearn-graph-1": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | deployment-controller | isvc-sklearn-graph-2-predictor | ScalingReplicaSet | Scaled up replica set isvc-sklearn-graph-2-predictor-847f6cf74b from 0 to 1 |
| | kserve-ci-e2e-test | deployment-controller | isvc-xgboost-graph-predictor | ScalingReplicaSet | Scaled up replica set isvc-xgboost-graph-predictor-669d8d6456 from 0 to 1 |
| | kserve-ci-e2e-test | replicaset-controller | isvc-xgboost-graph-predictor-669d8d6456 | SuccessfulCreate | Created pod: isvc-xgboost-graph-predictor-669d8d6456-qgmdv |
| | kserve-ci-e2e-test | deployment-controller | error-404-isvc-9e0ff-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-9e0ff-predictor-6655bb8c8f from 0 to 1 |
| | kserve-ci-e2e-test | replicaset-controller | isvc-sklearn-graph-1-predictor-5b497dcd98 | SuccessfulCreate | Created pod: isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Pulling | Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0cd196d4c53b891914316f18ab5cfa9f85258e057f3687e65332c70bf642d22d" |
| | kserve-ci-e2e-test | multus | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | AddedInterface | Add eth0 [10.133.0.32/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Pulling | Pulling image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" |
| | kserve-ci-e2e-test | multus | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | AddedInterface | Add eth0 [10.132.0.20/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Pulling | Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0cd196d4c53b891914316f18ab5cfa9f85258e057f3687e65332c70bf642d22d" |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Pulling | Pulling image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" |
| | kserve-ci-e2e-test | multus | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | AddedInterface | Add eth0 [10.132.0.21/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | replicaset-controller | isvc-sklearn-graph-2-predictor-847f6cf74b | SuccessfulCreate | Created pod: isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl |
| | kserve-ci-e2e-test | multus | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | AddedInterface | Add eth0 [10.134.0.23/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Pulling | Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0cd196d4c53b891914316f18ab5cfa9f85258e057f3687e65332c70bf642d22d" |
| | kserve-ci-e2e-test | multus | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | AddedInterface | Add eth0 [10.132.0.22/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Pulled | Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0cd196d4c53b891914316f18ab5cfa9f85258e057f3687e65332c70bf642d22d" in 4.132s (4.132s including waiting). Image size: 301288360 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Started | Started container storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Created | Created container: storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Created | Created container: storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Pulled | Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0cd196d4c53b891914316f18ab5cfa9f85258e057f3687e65332c70bf642d22d" in 3.251s (3.251s including waiting). Image size: 301288360 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Started | Started container storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Started | Started container storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Created | Created container: storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Pulled | Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0cd196d4c53b891914316f18ab5cfa9f85258e057f3687e65332c70bf642d22d" in 5.276s (5.276s including waiting). Image size: 301288360 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Pulling | Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1293" |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Pulling | Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1293" |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Pulling | Pulling image "kserve/xgbserver:latest" |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Pulled | Successfully pulled image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" in 14.351s (14.351s including waiting). Image size: 1332288608 bytes. |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Pulled | Successfully pulled image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" in 14.797s (14.797s including waiting). Image size: 1332303459 bytes. |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 3.993s (3.993s including waiting). Image size: 211946088 bytes. |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 3.994s (3.994s including waiting). Image size: 211946088 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Pulled | Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1293" in 12.425s (12.425s including waiting). Image size: 1560612266 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Pulling | Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Pulled | Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1293" in 12.919s (12.919s including waiting). Image size: 1560612266 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.127s (2.127s including waiting). Image size: 211946088 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f | Pulled | Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.109s (2.109s including waiting). Image size: 211946088 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Pulled | Successfully pulled image "kserve/xgbserver:latest" in 22.822s (22.822s including waiting). Image size: 1306229499 bytes. |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-qgmdv | Created | Created container: kube-rbac-proxy |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-9e0ff-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-9e0ff-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-9e0ff-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-9e0ff-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x5) | kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-9e0ff | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| (x5) | kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-9e0ff | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| | kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-9e0ff | InferenceServiceReady | InferenceService [error-404-isvc-9e0ff] is Ready |
| | kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-9e0ff | InferenceServiceReady | InferenceService [success-200-isvc-9e0ff] is Ready |
| | kserve-ci-e2e-test | kubelet | switch-graph-9e0ff-7bdf4d4867-lckwb | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "switch-graph-9e0ff-serving-cert" not found |
| | kserve-ci-e2e-test | replicaset-controller | switch-graph-9e0ff-7bdf4d4867 | SuccessfulCreate | Created pod: switch-graph-9e0ff-7bdf4d4867-lckwb |
| | kserve-ci-e2e-test | deployment-controller | switch-graph-9e0ff | ScalingReplicaSet | Scaled up replica set switch-graph-9e0ff-7bdf4d4867 from 0 to 1 |
| | kserve-ci-e2e-test | InferenceGraphController | switch-graph-9e0ff | InferenceGraphReady | InferenceGraph [switch-graph-9e0ff] is Ready |
| | kserve-ci-e2e-test | kubelet | switch-graph-9e0ff-7bdf4d4867-lckwb | Pulling | Pulling image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" |
| | kserve-ci-e2e-test | multus | switch-graph-9e0ff-7bdf4d4867-lckwb | AddedInterface | Add eth0 [10.132.0.23/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | switch-graph-9e0ff-7bdf4d4867-lckwb | Created | Created container: switch-graph-9e0ff |
| | kserve-ci-e2e-test | kubelet | switch-graph-9e0ff-7bdf4d4867-lckwb | Pulled | Successfully pulled image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" in 2.216s (2.216s including waiting). Image size: 216233549 bytes. |
| | kserve-ci-e2e-test | kubelet | switch-graph-9e0ff-7bdf4d4867-lckwb | Started | Started container switch-graph-9e0ff |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-9e0ff-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-9e0ff-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-9e0ff-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-9e0ff-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-2 | InferenceServiceReady | InferenceService [isvc-sklearn-graph-2] is Ready |
| (x10) | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-2 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| (x9) | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-1 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-1 | InferenceServiceReady | InferenceService [isvc-sklearn-graph-1] is Ready |
| | kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-ce4d4 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-ce4d4": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-ce4d4": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | deployment-controller | success-200-isvc-ce4d4-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-ce4d4-predictor-5d646b757d from 0 to 1 |
| | kserve-ci-e2e-test | replicaset-controller | success-200-isvc-ce4d4-predictor-5d646b757d | SuccessfulCreate | Created pod: success-200-isvc-ce4d4-predictor-5d646b757d-msdzr |
| | kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-ce4d4 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-ce4d4": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Killing | Stopping container kserve-container |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Killing | Stopping container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | switch-graph-9e0ff-7bdf4d4867-lckwb | Killing | Stopping container switch-graph-9e0ff |
| | kserve-ci-e2e-test | deployment-controller | error-404-isvc-ce4d4-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-ce4d4-predictor-5d96695c7c from 0 to 1 |
| | kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-ce4d4 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-ce4d4": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-ce4d4 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-ce4d4": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-ce4d4": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | replicaset-controller | error-404-isvc-ce4d4-predictor-5d96695c7c | SuccessfulCreate | Created pod: error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Killing | Stopping container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Killing | Stopping container kserve-container |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-ce4d4-predictor-5d646b757d-msdzr | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine |
| | kserve-ci-e2e-test | multus | success-200-isvc-ce4d4-predictor-5d646b757d-msdzr | AddedInterface | Add eth0 [10.132.0.24/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-ce4d4-predictor-5d646b757d-msdzr | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | multus | error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh | AddedInterface | Add eth0 [10.132.0.25/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-ce4d4-predictor-5d646b757d-msdzr | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-ce4d4-predictor-5d646b757d-msdzr | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-ce4d4-predictor-5d646b757d-msdzr | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-ce4d4-predictor-5d646b757d-msdzr | Started | Started container kube-rbac-proxy |
| (x7) | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Unhealthy | Readiness probe failed: dial tcp 10.132.0.22:8080: connect: connection refused |
| | kserve-ci-e2e-test | kubelet | error-404-isvc-9e0ff-predictor-6655bb8c8f-7m29l | Unhealthy | Readiness probe failed: Get "https://10.132.0.22:8643/healthz": dial tcp 10.132.0.22:8643: connect: connection refused |
| (x7) | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Unhealthy | Readiness probe failed: dial tcp 10.132.0.20:8080: connect: connection refused |
| | kserve-ci-e2e-test | kubelet | success-200-isvc-9e0ff-predictor-598855d998-zbbsz | Unhealthy | Readiness probe failed: Get "https://10.132.0.20:8643/healthz": dial tcp 10.132.0.20:8643: connect: connection refused |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-xgboost-graph | InferenceServiceReady | InferenceService [isvc-xgboost-graph] is Ready |
| (x9) | kserve-ci-e2e-test | v1beta1Controllers | isvc-xgboost-graph | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| (x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| | kserve-ci-e2e-test | kubelet | model-chainer-86b99c7c88-lngx8 | Created | Created container: model-chainer |
| (x6) | kserve-ci-e2e-test | kubelet | switch-graph-9e0ff-7bdf4d4867-lckwb | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 |
| | kserve-ci-e2e-test | InferenceGraphController | model-chainer | InferenceGraphReady | InferenceGraph [model-chainer] is Ready |
| | kserve-ci-e2e-test | deployment-controller | model-chainer | ScalingReplicaSet | Scaled up replica set model-chainer-86b99c7c88 from 0 to 1 |
| | kserve-ci-e2e-test | replicaset-controller | model-chainer-86b99c7c88 | SuccessfulCreate | Created pod: model-chainer-86b99c7c88-lngx8 |
| | kserve-ci-e2e-test | kubelet | model-chainer-86b99c7c88-lngx8 | Started | Started container model-chainer |
| | kserve-ci-e2e-test | kubelet | model-chainer-86b99c7c88-lngx8 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine |
| | kserve-ci-e2e-test | multus | model-chainer-86b99c7c88-lngx8 | AddedInterface | Add eth0 [10.132.0.26/23] from ovn-kubernetes |
| (x5) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x5) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-8b822-predictor-serving-cert" not found

kserve-ci-e2e-test

deployment-controller

success-200-isvc-8b822-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-8b822-predictor-74448fdcdb from 0 to 1

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-qgmdv

Killing

Stopping container kserve-container

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-8b822-predictor-74448fdcdb

SuccessfulCreate

Created pod: success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-qgmdv

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Started

Started container kserve-container

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-8b822-predictor-7b8bf795f7

SuccessfulCreate

Created pod: error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-8b822

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-8b822": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-8b822

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-8b822": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-8b822": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

error-404-isvc-8b822-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-8b822-predictor-7b8bf795f7 from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

model-chainer-86b99c7c88-lngx8

Killing

Stopping container model-chainer

kserve-ci-e2e-test

multus

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

AddedInterface

Add eth0 [10.132.0.27/23] from ovn-kubernetes

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-8b822

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-8b822": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-8b822

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-8b822": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-8b822": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

multus

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

AddedInterface

Add eth0 [10.132.0.28/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Created

Created container: kube-rbac-proxy
(x9)

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl

Unhealthy

Readiness probe failed: dial tcp 10.134.0.23:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f

Unhealthy

Readiness probe failed: Get "https://10.133.0.32:8643/healthz": dial tcp 10.133.0.32:8643: connect: connection refused
(x9)

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f

Unhealthy

Readiness probe failed: dial tcp 10.133.0.32:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-847f6cf74b-88pwl

Unhealthy

Readiness probe failed: Get "https://10.134.0.23:8643/healthz": dial tcp 10.134.0.23:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-qgmdv

Unhealthy

Readiness probe failed: Get "https://10.132.0.21:8643/healthz": dial tcp 10.132.0.21:8643: connect: connection refused
(x9)

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-qgmdv

Unhealthy

Readiness probe failed: dial tcp 10.132.0.21:8080: connect: connection refused

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-ce4d4

InferenceServiceReady

InferenceService [success-200-isvc-ce4d4] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-ce4d4

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-ce4d4

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-ce4d4

InferenceServiceReady

InferenceService [error-404-isvc-ce4d4] is Ready
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-ce4d4-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-ce4d4-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

InferenceGraphController

switch-graph-ce4d4

InferenceGraphReady

InferenceGraph [switch-graph-ce4d4] is Ready
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-ce4d4-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-ce4d4-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

deployment-controller

switch-graph-ce4d4

ScalingReplicaSet

Scaled up replica set switch-graph-ce4d4-66ccfb86c4 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

switch-graph-ce4d4-66ccfb86c4

SuccessfulCreate

Created pod: switch-graph-ce4d4-66ccfb86c4-sd5b6

kserve-ci-e2e-test

kubelet

switch-graph-ce4d4-66ccfb86c4-sd5b6

Created

Created container: switch-graph-ce4d4

kserve-ci-e2e-test

kubelet

switch-graph-ce4d4-66ccfb86c4-sd5b6

Started

Started container switch-graph-ce4d4

kserve-ci-e2e-test

kubelet

switch-graph-ce4d4-66ccfb86c4-sd5b6

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine

kserve-ci-e2e-test

multus

switch-graph-ce4d4-66ccfb86c4-sd5b6

AddedInterface

Add eth0 [10.132.0.29/23] from ovn-kubernetes
(x6)

kserve-ci-e2e-test

kubelet

model-chainer-86b99c7c88-lngx8

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-ce4d4-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-ce4d4-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-ce4d4-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-ce4d4-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-8b822-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-8b822-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-8b822

InferenceServiceReady

InferenceService [success-200-isvc-8b822] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-8b822

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-8b822

InferenceServiceReady

InferenceService [error-404-isvc-8b822] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-8b822

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

kubelet

sequence-graph-8b822-89658595d-trzvx

Started

Started container sequence-graph-8b822
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-8b822-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

multus

sequence-graph-8b822-89658595d-trzvx

AddedInterface

Add eth0 [10.132.0.30/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

sequence-graph-8b822-89658595d-trzvx

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-8b822-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

InferenceGraphController

sequence-graph-8b822

InferenceGraphReady

InferenceGraph [sequence-graph-8b822] is Ready

kserve-ci-e2e-test

deployment-controller

sequence-graph-8b822

ScalingReplicaSet

Scaled up replica set sequence-graph-8b822-89658595d from 0 to 1

kserve-ci-e2e-test

kubelet

sequence-graph-8b822-89658595d-trzvx

Created

Created container: sequence-graph-8b822

kserve-ci-e2e-test

replicaset-controller

sequence-graph-8b822-89658595d

SuccessfulCreate

Created pod: sequence-graph-8b822-89658595d-trzvx
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-ce4d4

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-ce4d4

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-8b822-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

sequence-graph-8b822

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

sequence-graph-8b822

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-8b822-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

sequence-graph-8b822

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

sequence-graph-8b822

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-8b822-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-8b822-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

deployment-controller

success-200-isvc-ac602-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-ac602-predictor-757bd75446 from 0 to 1

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-ac602

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-ac602": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-ac602": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

error-404-isvc-ac602-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-ac602-predictor-67b8b47b5c from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-ac602-predictor-757bd75446-krh9b

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-ac602-predictor-serving-cert" not found

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-ac602

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-ac602": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-ac602": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-ac602-predictor-757bd75446

SuccessfulCreate

Created pod: success-200-isvc-ac602-predictor-757bd75446-krh9b

kserve-ci-e2e-test

kubelet

error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh

Killing

Stopping container kserve-container

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-ac602-predictor-67b8b47b5c

SuccessfulCreate

Created pod: error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

kserve-ci-e2e-test

kubelet

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-ac602-predictor-serving-cert" not found

kserve-ci-e2e-test

kubelet

success-200-isvc-ce4d4-predictor-5d646b757d-msdzr

Killing

Stopping container kserve-container

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-ac602

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-ac602": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-ac602

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-ac602": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

success-200-isvc-ce4d4-predictor-5d646b757d-msdzr

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

switch-graph-ce4d4-66ccfb86c4-sd5b6

Killing

Stopping container switch-graph-ce4d4

kserve-ci-e2e-test

kubelet

error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-ac602-predictor-757bd75446-krh9b

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-ac602-predictor-757bd75446-krh9b

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

multus

success-200-isvc-ac602-predictor-757bd75446-krh9b

AddedInterface

Add eth0 [10.132.0.31/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-ac602-predictor-757bd75446-krh9b

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-ac602-predictor-757bd75446-krh9b

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-ac602-predictor-757bd75446-krh9b

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-ac602-predictor-757bd75446-krh9b

Started

Started container kserve-container

kserve-ci-e2e-test

multus

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

AddedInterface

Add eth0 [10.132.0.32/23] from ovn-kubernetes
(x8)

kserve-ci-e2e-test

kubelet

success-200-isvc-ce4d4-predictor-5d646b757d-msdzr

Unhealthy

Readiness probe failed: dial tcp 10.132.0.24:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh

Unhealthy

Readiness probe failed: Get "https://10.132.0.25:8643/healthz": dial tcp 10.132.0.25:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-ce4d4-predictor-5d646b757d-msdzr

Unhealthy

Readiness probe failed: Get "https://10.132.0.24:8643/healthz": dial tcp 10.132.0.24:8643: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-ce4d4-predictor-5d96695c7c-7bsvh

Unhealthy

Readiness probe failed: dial tcp 10.132.0.25:8080: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

switch-graph-ce4d4-66ccfb86c4-sd5b6

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "switch-graph-ce4d4-serving-cert" not found
(x6)

kserve-ci-e2e-test

kubelet

switch-graph-ce4d4-66ccfb86c4-sd5b6

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-ac602-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-ac602-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-ac602-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-ac602-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-d0d22

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-d0d22": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-d0d22": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-d0d22

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-d0d22": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-d0d22": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-d0d22

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-d0d22": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

success-200-isvc-d0d22-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-d0d22-predictor-65f777864b from 0 to 1

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-d0d22-predictor-65f777864b

SuccessfulCreate

Created pod: success-200-isvc-d0d22-predictor-65f777864b-fb2rr

kserve-ci-e2e-test

kubelet

success-200-isvc-d0d22-predictor-65f777864b-fb2rr

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-d0d22-predictor-65f777864b-fb2rr

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-d0d22

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-d0d22": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

error-404-isvc-d0d22-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-d0d22-predictor-76db56b8f from 0 to 1

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-d0d22-predictor-76db56b8f

SuccessfulCreate

Created pod: error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

kserve-ci-e2e-test

kubelet

sequence-graph-8b822-89658595d-trzvx

Killing

Stopping container sequence-graph-8b822

kserve-ci-e2e-test

kubelet

success-200-isvc-d0d22-predictor-65f777864b-fb2rr

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-d0d22-predictor-serving-cert" not found

kserve-ci-e2e-test

kubelet

success-200-isvc-d0d22-predictor-65f777864b-fb2rr

Started

Started container kserve-container

kserve-ci-e2e-test

multus

success-200-isvc-d0d22-predictor-65f777864b-fb2rr

AddedInterface

Add eth0 [10.132.0.33/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-d0d22-predictor-65f777864b-fb2rr

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-d0d22-predictor-65f777864b-fb2rr

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

multus

error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

AddedInterface

Add eth0 [10.132.0.34/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-d0d22-predictor-76db56b8f-hz2ck

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Unhealthy

Readiness probe failed: dial tcp 10.132.0.27:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-8b822-predictor-74448fdcdb-6sqxm

Unhealthy

Readiness probe failed: Get "https://10.132.0.27:8643/healthz": dial tcp 10.132.0.27:8643: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w

Unhealthy

Readiness probe failed: dial tcp 10.132.0.32:8080: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Unhealthy

Readiness probe failed: dial tcp 10.132.0.28:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-8b822-predictor-7b8bf795f7-l6kw5

Unhealthy

Readiness probe failed: Get "https://10.132.0.28:8643/healthz": dial tcp 10.132.0.28:8643: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-ac602-predictor-757bd75446-krh9b

Unhealthy

Readiness probe failed: dial tcp 10.132.0.31:8080: connect: connection refused
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-ac602

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-ac602 | InferenceServiceReady | InferenceService [error-404-isvc-ac602] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-ac602 | InferenceServiceReady | InferenceService [success-200-isvc-ac602] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-ac602 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-8b822-89658595d-trzvx | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x6)
kserve-ci-e2e-test | deployment-controller | ensemble-graph-ac602 | ScalingReplicaSet | Scaled up replica set ensemble-graph-ac602-7c945b5b88 from 0 to 1
kserve-ci-e2e-test | multus | ensemble-graph-ac602-7c945b5b88-wwhgf | AddedInterface | Add eth0 [10.132.0.35/23] from ovn-kubernetes
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-ac602-7c945b5b88 | SuccessfulCreate | Created pod: ensemble-graph-ac602-7c945b5b88-wwhgf
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-ac602 | InferenceGraphReady | InferenceGraph [ensemble-graph-ac602] is Ready
kserve-ci-e2e-test | kubelet | ensemble-graph-ac602-7c945b5b88-wwhgf | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine
kserve-ci-e2e-test | kubelet | ensemble-graph-ac602-7c945b5b88-wwhgf | Started | Started container ensemble-graph-ac602
kserve-ci-e2e-test | kubelet | ensemble-graph-ac602-7c945b5b88-wwhgf | Created | Created container: ensemble-graph-ac602
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-ac602-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-ac602-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-ac602-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-ac602-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-c68dd | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-c68dd": the object has been modified; please apply your changes to the latest version and try again
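The recurring "the object has been modified" InternalError/UpdateFailed events are ordinary Kubernetes optimistic-concurrency conflicts (HTTP 409): two writers raced on the same InferenceService status. The standard remedy is a read-modify-retry loop. A minimal sketch of that pattern against a hypothetical in-memory store — this is not KServe's controller code, just the behavior the error message asks for:

```python
# Optimistic-concurrency retry sketch. `Store` is a hypothetical stand-in for
# the API server; only the resourceVersion check mirrors real Kubernetes.

class Conflict(Exception):
    """Stale resourceVersion, analogous to an apiserver 409."""

class Store:
    """Minimal stand-in for the API server holding one object."""

    def __init__(self):
        self.obj = {"resourceVersion": 1, "status": "Pending"}

    def get(self):
        return dict(self.obj)  # hand out a copy, like a GET

    def update(self, obj):
        # Reject writes carrying an out-of-date resourceVersion.
        if obj["resourceVersion"] != self.obj["resourceVersion"]:
            raise Conflict("the object has been modified")
        new = dict(obj)
        new["resourceVersion"] += 1
        self.obj = new
        return dict(new)

def update_status_with_retry(store, new_status, attempts=5):
    """Re-read on Conflict and reapply the change (cf. client-go's RetryOnConflict)."""
    for _ in range(attempts):
        obj = store.get()           # always start from the latest version
        obj["status"] = new_status  # reapply the mutation on top of it
        try:
            return store.update(obj)
        except Conflict:
            continue                # another writer won the race; refetch, retry
    raise RuntimeError("gave up after repeated conflicts")
```

Because the controller eventually reconciles again, these events are usually harmless noise in CI; they only matter if the same update fails on every attempt.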

kserve-ci-e2e-test | replicaset-controller | success-200-isvc-c68dd-predictor-57f664bd48 | SuccessfulCreate | Created pod: success-200-isvc-c68dd-predictor-57f664bd48-xzcc6
kserve-ci-e2e-test | kubelet | error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | ensemble-graph-ac602-7c945b5b88-wwhgf | Killing | Stopping container ensemble-graph-ac602
kserve-ci-e2e-test | kubelet | error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w | Killing | Stopping container kserve-container
kserve-ci-e2e-test | deployment-controller | success-200-isvc-c68dd-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-c68dd-predictor-57f664bd48 from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-c68dd | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-c68dd": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-c68dd": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-ac602-predictor-757bd75446-krh9b | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-ac602-predictor-757bd75446-krh9b | Killing | Stopping container kserve-container
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-c68dd-predictor-9956c4bc9 | SuccessfulCreate | Created pod: error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq
kserve-ci-e2e-test | deployment-controller | error-404-isvc-c68dd-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-c68dd-predictor-9956c4bc9 from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-c68dd | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-c68dd": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-c68dd | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-c68dd": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-c68dd": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | AddedInterface | Add eth0 [10.132.0.36/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | multus | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | AddedInterface | Add eth0 [10.132.0.37/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-d0d22-predictor-65f777864b-fb2rr | Unhealthy | Readiness probe failed: dial tcp 10.132.0.33:8080: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | success-200-isvc-ac602-predictor-757bd75446-krh9b | Unhealthy | Readiness probe failed: Get "https://10.132.0.31:8643/healthz": dial tcp 10.132.0.31:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-d0d22-predictor-76db56b8f-hz2ck | Unhealthy | Readiness probe failed: dial tcp 10.132.0.34:8080: connect: connection refused (x8)
kserve-ci-e2e-test | kubelet | error-404-isvc-ac602-predictor-67b8b47b5c-d2b2w | Unhealthy | Readiness probe failed: Get "https://10.132.0.32:8643/healthz": context deadline exceeded
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-d0d22 | InferenceServiceReady | InferenceService [success-200-isvc-d0d22] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-d0d22 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-d0d22 | InferenceServiceReady | InferenceService [error-404-isvc-d0d22] is Ready
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-d0d22 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-d0d22-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-d0d22-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-d0d22-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-d0d22-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | deployment-controller | sequence-graph-d0d22 | ScalingReplicaSet | Scaled up replica set sequence-graph-d0d22-77bb574b5d from 0 to 1
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-d0d22 | InferenceGraphReady | InferenceGraph [sequence-graph-d0d22] is Ready
kserve-ci-e2e-test | replicaset-controller | sequence-graph-d0d22-77bb574b5d | SuccessfulCreate | Created pod: sequence-graph-d0d22-77bb574b5d-lkw4b
kserve-ci-e2e-test | kubelet | sequence-graph-d0d22-77bb574b5d-lkw4b | Started | Started container sequence-graph-d0d22
kserve-ci-e2e-test | kubelet | sequence-graph-d0d22-77bb574b5d-lkw4b | Created | Created container: sequence-graph-d0d22
kserve-ci-e2e-test | multus | sequence-graph-d0d22-77bb574b5d-lkw4b | AddedInterface | Add eth0 [10.132.0.38/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | sequence-graph-d0d22-77bb574b5d-lkw4b | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine
kserve-ci-e2e-test | kubelet | ensemble-graph-ac602-7c945b5b88-wwhgf | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x6)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-d0d22-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-d0d22-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-d0d22-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-d0d22-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | kubelet | sequence-graph-d0d22-77bb574b5d-lkw4b | Killing | Stopping container sequence-graph-d0d22
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-f6018-predictor-6c99ffbf99 | SuccessfulCreate | Created pod: error-404-isvc-f6018-predictor-6c99ffbf99-pllwl
kserve-ci-e2e-test | kubelet | success-200-isvc-d0d22-predictor-65f777864b-fb2rr | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f6018 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-f6018": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f6018 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-f6018": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-f6018": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f6018 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-f6018": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-d0d22-predictor-76db56b8f-hz2ck | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-d0d22-predictor-65f777864b-fb2rr | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f6018 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-f6018": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-f6018": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | error-404-isvc-f6018-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-f6018-predictor-6c99ffbf99 from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-d0d22-predictor-76db56b8f-hz2ck | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-f6018-predictor-serving-cert" not found
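The "proxy-tls" FailedMount events are typically transient on OpenShift: the secret is issued asynchronously by the service-ca operator in response to a serving-cert annotation on the predictor's Service, and the mount succeeds once the secret exists. A hedged sketch of the mechanism (the annotation key is standard OpenShift; the object names here merely mirror the event and may not match the operator's actual manifests):

```yaml
# Illustrative Service excerpt; service-ca creates the named secret on demand.
apiVersion: v1
kind: Service
metadata:
  name: success-200-isvc-f6018-predictor
  annotations:
    service.beta.openshift.io/serving-cert-secret-name: success-200-isvc-f6018-predictor-serving-cert
```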

kserve-ci-e2e-test | multus | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | AddedInterface | Add eth0 [10.132.0.40/23] from ovn-kubernetes
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-f6018-predictor-55f969cd7b | SuccessfulCreate | Created pod: success-200-isvc-f6018-predictor-55f969cd7b-vv7hw
kserve-ci-e2e-test | deployment-controller | success-200-isvc-f6018-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-f6018-predictor-55f969cd7b from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Started | Started container kserve-container
kserve-ci-e2e-test | multus | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | AddedInterface | Add eth0 [10.132.0.39/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-d0d22-predictor-65f777864b-fb2rr | Unhealthy | Readiness probe failed: Get "https://10.132.0.33:8643/healthz": dial tcp 10.132.0.33:8643: connect: connection refused
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-c68dd-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-c68dd-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-c68dd-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-c68dd-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Unhealthy | Readiness probe failed: dial tcp 10.132.0.36:8080: connect: connection refused (x7)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-c68dd | InferenceServiceReady | InferenceService [error-404-isvc-c68dd] is Ready
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-c68dd | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-c68dd | InferenceServiceReady | InferenceService [success-200-isvc-c68dd] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-c68dd | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | kubelet | sequence-graph-d0d22-77bb574b5d-lkw4b | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x6)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-f6018-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-f6018-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-f6018-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-f6018-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-c68dd-c6797d99d | SuccessfulCreate | Created pod: ensemble-graph-c68dd-c6797d99d-xsj96
kserve-ci-e2e-test | deployment-controller | ensemble-graph-c68dd | ScalingReplicaSet | Scaled up replica set ensemble-graph-c68dd-c6797d99d from 0 to 1
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-c68dd | InferenceGraphReady | InferenceGraph [ensemble-graph-c68dd] is Ready
kserve-ci-e2e-test | kubelet | ensemble-graph-c68dd-c6797d99d-xsj96 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine
kserve-ci-e2e-test | kubelet | ensemble-graph-c68dd-c6797d99d-xsj96 | Created | Created container: ensemble-graph-c68dd
kserve-ci-e2e-test | kubelet | ensemble-graph-c68dd-c6797d99d-xsj96 | Started | Started container ensemble-graph-c68dd
kserve-ci-e2e-test | multus | ensemble-graph-c68dd-c6797d99d-xsj96 | AddedInterface | Add eth0 [10.132.0.41/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-f6018-predictor-55f969cd7b-vv7hw | Unhealthy | Readiness probe failed: dial tcp 10.132.0.39:8080: connect: connection refused (x6)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-c68dd-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-c68dd-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f6018 | InferenceServiceReady | InferenceService [error-404-isvc-f6018] is Ready
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-f6018 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-c68dd | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f6018 | InferenceServiceReady | InferenceService [success-200-isvc-f6018] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-f6018 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-c68dd | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-c68dd-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-c68dd-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-f6018-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-f6018-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | kubelet | sequence-graph-f6018-777885f489-bxxmg | Started | Started container sequence-graph-f6018
kserve-ci-e2e-test | kubelet | sequence-graph-f6018-777885f489-bxxmg | Created | Created container: sequence-graph-f6018
kserve-ci-e2e-test | kubelet | sequence-graph-f6018-777885f489-bxxmg | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine
kserve-ci-e2e-test | multus | sequence-graph-f6018-777885f489-bxxmg | AddedInterface | Add eth0 [10.132.0.42/23] from ovn-kubernetes
kserve-ci-e2e-test | replicaset-controller | sequence-graph-f6018-777885f489 | SuccessfulCreate | Created pod: sequence-graph-f6018-777885f489-bxxmg
kserve-ci-e2e-test | deployment-controller | sequence-graph-f6018 | ScalingReplicaSet | Scaled up replica set sequence-graph-f6018-777885f489 from 0 to 1
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-f6018 | InferenceGraphReady | InferenceGraph [sequence-graph-f6018] is Ready
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-c68dd | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-c68dd | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-f6018-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-f6018-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-f6018 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-f6018 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | kubelet | ensemble-graph-c68dd-c6797d99d-xsj96 | Killing | Stopping container ensemble-graph-c68dd
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-41c4d-predictor-56f84dd86b | SuccessfulCreate | Created pod: success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp
kserve-ci-e2e-test | deployment-controller | success-200-isvc-41c4d-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-41c4d-predictor-56f84dd86b from 0 to 1
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-41c4d-predictor-ffc88656f | SuccessfulCreate | Created pod: error-404-isvc-41c4d-predictor-ffc88656f-4gvqb
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Killing | Stopping container kserve-container
kserve-ci-e2e-test | deployment-controller | error-404-isvc-41c4d-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-41c4d-predictor-ffc88656f from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-41c4d | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-41c4d": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-41c4d": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-41c4d | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-41c4d": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-41c4d-predictor-ffc88656f-4gvqb | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-41c4d-predictor-ffc88656f-4gvqb | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-41c4d-predictor-ffc88656f-4gvqb | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp | AddedInterface | Add eth0 [10.132.0.43/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp | Created | Created container: kserve-container
kserve-ci-e2e-test | multus | error-404-isvc-41c4d-predictor-ffc88656f-4gvqb | AddedInterface | Add eth0 [10.132.0.44/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-41c4d-predictor-ffc88656f-4gvqb | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-41c4d-predictor-ffc88656f-4gvqb | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-41c4d-predictor-ffc88656f-4gvqb | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Unhealthy | Readiness probe failed: Get "https://10.132.0.37:8643/healthz": dial tcp 10.132.0.37:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-c68dd-predictor-9956c4bc9-5pwwq | Unhealthy | Readiness probe failed: dial tcp 10.132.0.37:8080: connect: connection refused (x8)
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Unhealthy | Readiness probe failed: Get "https://10.132.0.36:8643/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
kserve-ci-e2e-test | kubelet | success-200-isvc-c68dd-predictor-57f664bd48-xzcc6 | Unhealthy | Readiness probe failed: dial tcp 10.132.0.36:8080: i/o timeout
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-41c4d-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-41c4d-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6941b | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6941b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | error-404-isvc-6941b-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-6941b-predictor-79dfb7969c from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-79dfb7969c-rn424 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-6941b-predictor-serving-cert" not found
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-6941b-predictor-5648f5ddc4 | SuccessfulCreate | Created pod: success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6941b | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-6941b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | success-200-isvc-6941b-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-6941b-predictor-5648f5ddc4 from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6941b | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-6941b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-6941b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-f6018-predictor-6c99ffbf99-pllwl | Killing | Stopping container kserve-container

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-6941b-predictor-79dfb7969c

SuccessfulCreate

Created pod: error-404-isvc-6941b-predictor-79dfb7969c-rn424

kserve-ci-e2e-test

kubelet

success-200-isvc-f6018-predictor-55f969cd7b-vv7hw

Killing

Stopping container kserve-container

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-6941b

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-6941b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6941b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

sequence-graph-f6018-777885f489-bxxmg

Killing

Stopping container sequence-graph-f6018

kserve-ci-e2e-test

kubelet

success-200-isvc-f6018-predictor-55f969cd7b-vv7hw

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-6941b-predictor-serving-cert" not found

kserve-ci-e2e-test

multus

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

AddedInterface

Add eth0 [10.132.0.45/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Created

Created container: kserve-container
(x6)

kserve-ci-e2e-test

kubelet

ensemble-graph-c68dd-c6797d99d-xsj96

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

multus

error-404-isvc-6941b-predictor-79dfb7969c-rn424

AddedInterface

Add eth0 [10.132.0.46/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-f6018-predictor-6c99ffbf99-pllwl

Unhealthy

Readiness probe failed: Get "https://10.132.0.40:8643/healthz": dial tcp 10.132.0.40:8643: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-f6018-predictor-6c99ffbf99-pllwl

Unhealthy

Readiness probe failed: dial tcp 10.132.0.40:8080: connect: connection refused
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-41c4d-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-41c4d-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp

Unhealthy

Readiness probe failed: dial tcp 10.132.0.43:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-41c4d-predictor-ffc88656f-4gvqb

Unhealthy

Readiness probe failed: dial tcp 10.132.0.44:8080: connect: connection refused
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-41c4d

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-41c4d

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-41c4d

InferenceServiceReady

InferenceService [error-404-isvc-41c4d] is Ready

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-41c4d

InferenceServiceReady

InferenceService [success-200-isvc-41c4d] is Ready
(x6)

kserve-ci-e2e-test

kubelet

sequence-graph-f6018-777885f489-bxxmg

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-6941b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-6941b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-6941b-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-6941b-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

replicaset-controller

splitter-graph-41c4d-654b49775c

SuccessfulCreate

Created pod: splitter-graph-41c4d-654b49775c-n9pd5
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-41c4d-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

InferenceGraphController

splitter-graph-41c4d

InferenceGraphReady

InferenceGraph [splitter-graph-41c4d] is Ready

kserve-ci-e2e-test

deployment-controller

splitter-graph-41c4d

ScalingReplicaSet

Scaled up replica set splitter-graph-41c4d-654b49775c from 0 to 1
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-41c4d-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-41c4d-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

splitter-graph-41c4d-654b49775c-n9pd5

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "splitter-graph-41c4d-serving-cert" not found
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-41c4d-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

splitter-graph-41c4d-654b49775c-n9pd5

Started

Started container splitter-graph-41c4d

kserve-ci-e2e-test

kubelet

splitter-graph-41c4d-654b49775c-n9pd5

Created

Created container: splitter-graph-41c4d

kserve-ci-e2e-test

kubelet

splitter-graph-41c4d-654b49775c-n9pd5

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine

kserve-ci-e2e-test

multus

splitter-graph-41c4d-654b49775c-n9pd5

AddedInterface

Add eth0 [10.132.0.47/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

splitter-graph-41c4d-654b49775c-n9pd5

Killing

Stopping container splitter-graph-41c4d

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-7afa6

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-7afa6": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-7afa6-predictor-7549c68964

SuccessfulCreate

Created pod: success-200-isvc-7afa6-predictor-7549c68964-lvtm5

kserve-ci-e2e-test

deployment-controller

success-200-isvc-7afa6-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-7afa6-predictor-7549c68964 from 0 to 1

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-7afa6

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-7afa6": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-7afa6": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-7afa6

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-7afa6": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-7afa6-predictor-serving-cert" not found

kserve-ci-e2e-test

kubelet

error-404-isvc-41c4d-predictor-ffc88656f-4gvqb

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-41c4d-predictor-ffc88656f-4gvqb

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-7afa6-predictor-serving-cert" not found

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-7afa6

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-7afa6": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-7afa6": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

error-404-isvc-7afa6-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-7afa6-predictor-5f65ffc855 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-7afa6-predictor-5f65ffc855

SuccessfulCreate

Created pod: error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1293" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Created

Created container: kserve-container

kserve-ci-e2e-test

multus

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

AddedInterface

Add eth0 [10.132.0.48/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Started

Started container kserve-container

kserve-ci-e2e-test

multus

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

AddedInterface

Add eth0 [10.132.0.49/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-41c4d-predictor-56f84dd86b-fzdzp

Unhealthy

Readiness probe failed: Get "https://10.132.0.43:8643/healthz": dial tcp 10.132.0.43:8643: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Unhealthy

Readiness probe failed: dial tcp 10.132.0.45:8080: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Unhealthy

Readiness probe failed: dial tcp 10.132.0.46:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-41c4d-predictor-ffc88656f-4gvqb

Unhealthy

Readiness probe failed: Get "https://10.132.0.44:8643/healthz": dial tcp 10.132.0.44:8643: connect: connection refused
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-6941b

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-6941b

InferenceServiceReady

InferenceService [success-200-isvc-6941b] is Ready

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-6941b

InferenceServiceReady

InferenceService [error-404-isvc-6941b] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-6941b

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

kubelet

switch-graph-6941b-869c44b684-24nwx

Started

Started container switch-graph-6941b

kserve-ci-e2e-test

kubelet

switch-graph-6941b-869c44b684-24nwx

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine

kserve-ci-e2e-test

multus

switch-graph-6941b-869c44b684-24nwx

AddedInterface

Add eth0 [10.132.0.50/23] from ovn-kubernetes

kserve-ci-e2e-test

replicaset-controller

switch-graph-6941b-869c44b684

SuccessfulCreate

Created pod: switch-graph-6941b-869c44b684-24nwx

kserve-ci-e2e-test

kubelet

switch-graph-6941b-869c44b684-24nwx

Created

Created container: switch-graph-6941b

kserve-ci-e2e-test

deployment-controller

switch-graph-6941b

ScalingReplicaSet

Scaled up replica set switch-graph-6941b-869c44b684 from 0 to 1

kserve-ci-e2e-test

InferenceGraphController

switch-graph-6941b

InferenceGraphReady

InferenceGraph [switch-graph-6941b] is Ready
(x6)

kserve-ci-e2e-test

kubelet

splitter-graph-41c4d-654b49775c-n9pd5

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-7afa6-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-6941b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-6941b-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-7afa6-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-7afa6-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-7afa6-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Unhealthy

Readiness probe failed: dial tcp 10.132.0.49:8080: connect: connection refused

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-6941b

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-6941b

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-6941b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-6941b-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Unhealthy

Readiness probe failed: dial tcp 10.132.0.48:8080: connect: connection refused
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-7afa6

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-7afa6

InferenceServiceReady

InferenceService [error-404-isvc-7afa6] is Ready

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-6941b

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

switch-graph-6941b

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-7afa6

InferenceServiceReady

InferenceService [success-200-isvc-7afa6] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-7afa6

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

multus

splitter-graph-7afa6-7fb6d7495d-5g5l4

AddedInterface

Add eth0 [10.132.0.51/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

splitter-graph-7afa6-7fb6d7495d-5g5l4

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:f452095929b339c4acc2639af4bf7c4693d3269c9b093ed79b1f54101a2de422" already present on machine

kserve-ci-e2e-test

kubelet

splitter-graph-7afa6-7fb6d7495d-5g5l4

Created

Created container: splitter-graph-7afa6

kserve-ci-e2e-test

kubelet

splitter-graph-7afa6-7fb6d7495d-5g5l4

Started

Started container splitter-graph-7afa6

kserve-ci-e2e-test

replicaset-controller

splitter-graph-7afa6-7fb6d7495d

SuccessfulCreate

Created pod: splitter-graph-7afa6-7fb6d7495d-5g5l4

kserve-ci-e2e-test

deployment-controller

splitter-graph-7afa6

ScalingReplicaSet

Scaled up replica set splitter-graph-7afa6-7fb6d7495d from 0 to 1

kserve-ci-e2e-test

InferenceGraphController

splitter-graph-7afa6

InferenceGraphReady

InferenceGraph [splitter-graph-7afa6] is Ready
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-7afa6-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-7afa6-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-7afa6-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-7afa6-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-7afa6

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-7afa6

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-7afa6

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

horizontal-pod-autoscaler

splitter-graph-7afa6

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

splitter-graph-7afa6-7fb6d7495d-5g5l4

Killing

Stopping container splitter-graph-7afa6

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-7afa6-predictor-5f65ffc855-lhn82

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-7afa6-predictor-7549c68964-lvtm5

Unhealthy

Readiness probe failed: Get "https://10.132.0.48:8643/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
(x6)

kserve-ci-e2e-test

kubelet

splitter-graph-7afa6-7fb6d7495d-5g5l4

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test

kubelet

switch-graph-6941b-869c44b684-24nwx

Killing

Stopping container switch-graph-6941b

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-6941b-predictor-5648f5ddc4-hkwmz

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-6941b-predictor-79dfb7969c-rn424

Unhealthy

Readiness probe failed: Get "https://10.132.0.46:8643/healthz": dial tcp 10.132.0.46:8643: connect: connection refused
(x5)

kserve-ci-e2e-test

kubelet

switch-graph-6941b-869c44b684-24nwx

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503