Time | Namespace | Component | RelatedObject | Reason | Message
| kserve-ci-e2e-test | | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9 | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9 to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | isvc-xgboost-graph-predictor-669d8d6456-n7ckc | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | switch-graph-9db4b-8484996d65-m4klk | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc | Scheduled | Successfully assigned kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | switch-graph-6d3ba-548c7746dd-xgm2g | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw | Scheduled | Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw to ip-10-0-130-31.ec2.internal
| kserve-ci-e2e-test | | switch-graph-fc125-5c9566c648-xg996 | Scheduled | Successfully assigned kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996 to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | sequence-graph-54155-84d8774cb-sx4nk | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65 to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | ensemble-graph-a9fd5-7b8774d5b4-6s425 | Scheduled | Successfully assigned kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425 to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | error-404-isvc-54155-predictor-db578f77d-7fmtl | Scheduled | Successfully assigned kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | model-chainer-85c5fc8d94-dvnrr | Scheduled | Successfully assigned kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | sequence-graph-6445b-696c86c896-8k2tz | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | sequence-graph-d1134-69b44c49f9-tm66r | Scheduled | Successfully assigned kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | splitter-graph-6842b-55bb6cfccf-g8lvd | Scheduled | Successfully assigned kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | splitter-graph-6941b-78cf747fb6-bxvdp | Scheduled | Successfully assigned kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp to ip-10-0-136-201.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-54155-predictor-744c89d589-bgsvq | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq to ip-10-0-132-124.ec2.internal
| kserve-ci-e2e-test | | success-200-isvc-6445b-predictor-769c496d67-hc2tf | Scheduled | Successfully assigned kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf to ip-10-0-132-124.ec2.internal
kserve-ci-e2e-test

replicaset-controller

success-200-isvc-9db4b-predictor-5f9575b58c

SuccessfulCreate

Created pod: success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

kserve-ci-e2e-test

deployment-controller

success-200-isvc-9db4b-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-9db4b-predictor-5f9575b58c from 0 to 1

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-9db4b

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-9db4b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-9db4b

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-9db4b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-9db4b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

replicaset-controller

isvc-sklearn-graph-1-predictor-58d89b86bf

SuccessfulCreate

Created pod: isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

kserve-ci-e2e-test

deployment-controller

error-404-isvc-9db4b-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-9db4b-predictor-7ffc8c88fb from 0 to 1

kserve-ci-e2e-test

deployment-controller

isvc-sklearn-graph-1-predictor

ScalingReplicaSet

Scaled up replica set isvc-sklearn-graph-1-predictor-58d89b86bf from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Pulling

Pulling image "quay.io/opendatahub/success-200-isvc:odh-pr-1450"

kserve-ci-e2e-test

multus

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

AddedInterface

Add eth0 [10.134.0.23/23] from ovn-kubernetes

kserve-ci-e2e-test

deployment-controller

isvc-xgboost-graph-predictor

ScalingReplicaSet

Scaled up replica set isvc-xgboost-graph-predictor-669d8d6456 from 0 to 1

kserve-ci-e2e-test

multus

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

AddedInterface

Add eth0 [10.134.0.24/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0dd2798f9cdbffc0563bd148612201df0e589ff8c26a6b19d321fd120fc5c097"

kserve-ci-e2e-test

replicaset-controller

isvc-xgboost-graph-predictor-669d8d6456

SuccessfulCreate

Created pod: isvc-xgboost-graph-predictor-669d8d6456-n7ckc

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0dd2798f9cdbffc0563bd148612201df0e589ff8c26a6b19d321fd120fc5c097"

kserve-ci-e2e-test

multus

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

AddedInterface

Add eth0 [10.132.0.23/23] from ovn-kubernetes

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-9db4b-predictor-7ffc8c88fb

SuccessfulCreate

Created pod: error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

kserve-ci-e2e-test

deployment-controller

isvc-sklearn-graph-2-predictor

ScalingReplicaSet

Scaled up replica set isvc-sklearn-graph-2-predictor-58cc99f6ff from 0 to 1

kserve-ci-e2e-test

multus

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

AddedInterface

Add eth0 [10.134.0.25/23] from ovn-kubernetes

kserve-ci-e2e-test

replicaset-controller

isvc-sklearn-graph-2-predictor-58cc99f6ff

SuccessfulCreate

Created pod: isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:0dd2798f9cdbffc0563bd148612201df0e589ff8c26a6b19d321fd120fc5c097"

kserve-ci-e2e-test

multus

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

AddedInterface

Add eth0 [10.133.0.27/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Pulling

Pulling image "quay.io/opendatahub/error-404-isvc:odh-pr-1450"

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0dd2798f9cdbffc0563bd148612201df0e589ff8c26a6b19d321fd120fc5c097" in 4.174s (4.174s including waiting). Image size: 299845049 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0dd2798f9cdbffc0563bd148612201df0e589ff8c26a6b19d321fd120fc5c097" in 3.387s (3.387s including waiting). Image size: 299845049 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Pulling

Pulling image "kserve/xgbserver:latest"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Pulling

Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1450"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Pulled

Successfully pulled image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" in 16.386s (16.386s including waiting). Image size: 1334822504 bytes.

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Pulled

Successfully pulled image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" in 17.051s (17.051s including waiting). Image size: 1335723119 bytes.

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:0dd2798f9cdbffc0563bd148612201df0e589ff8c26a6b19d321fd120fc5c097" in 16.633s (16.633s including waiting). Image size: 299845049 bytes.

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.416s (2.416s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.421s (2.421s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Pulling

Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1450"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Pulled

Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1450" in 13.373s (13.373s including waiting). Image size: 1560926126 bytes.

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.222s (2.222s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Pulled

Successfully pulled image "kserve/xgbserver:latest" in 19.626s (19.626s including waiting). Image size: 1306417402 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Pulled

Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1450" in 6.142s (6.142s including waiting). Image size: 1560926126 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-9db4b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-9db4b-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.818s (2.818s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Created

Created container: kube-rbac-proxy
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-2-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-9db4b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-2-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-9db4b-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x6)

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Unhealthy

Readiness probe failed: dial tcp 10.134.0.23:8080: connect: connection refused
(x6)

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Unhealthy

Readiness probe failed: dial tcp 10.134.0.25:8080: connect: connection refused
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-1-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-1-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-9db4b

InferenceServiceReady

InferenceService [success-200-isvc-9db4b] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-9db4b

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-9db4b

InferenceServiceReady

InferenceService [error-404-isvc-9db4b] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-9db4b

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-9db4b-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

error-404-isvc-9db4b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

replicaset-controller

switch-graph-9db4b-8484996d65

SuccessfulCreate

Created pod: switch-graph-9db4b-8484996d65-m4klk

kserve-ci-e2e-test

deployment-controller

switch-graph-9db4b

ScalingReplicaSet

Scaled up replica set switch-graph-9db4b-8484996d65 from 0 to 1

kserve-ci-e2e-test

InferenceGraphController

switch-graph-9db4b

InferenceGraphReady

InferenceGraph [switch-graph-9db4b] is Ready

kserve-ci-e2e-test

kubelet

switch-graph-9db4b-8484996d65-m4klk

Pulling

Pulling image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb"

kserve-ci-e2e-test

multus

switch-graph-9db4b-8484996d65-m4klk

AddedInterface

Add eth0 [10.132.0.24/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

switch-graph-9db4b-8484996d65-m4klk

Created

Created container: switch-graph-9db4b

kserve-ci-e2e-test

kubelet

switch-graph-9db4b-8484996d65-m4klk

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" in 1.98s (1.98s including waiting). Image size: 216346202 bytes.

kserve-ci-e2e-test

kubelet

switch-graph-9db4b-8484996d65-m4klk

Started

Started container switch-graph-9db4b
(x8)

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f

Unhealthy

Readiness probe failed: dial tcp 10.134.0.24:8080: connect: connection refused
(x8)

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-predictor-669d8d6456-n7ckc

Unhealthy

Readiness probe failed: dial tcp 10.132.0.23:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

multus

success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc

AddedInterface

Add eth0 [10.134.0.26/23] from ovn-kubernetes

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-6d3ba

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-6d3ba": the object has been modified; please apply your changes to the latest version and try again
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-9db4b-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

switch-graph-9db4b-8484996d65-m4klk

Killing

Stopping container switch-graph-9db4b

kserve-ci-e2e-test

deployment-controller

error-404-isvc-6d3ba-predictor

ScalingReplicaSet

Scaled up replica set error-404-isvc-6d3ba-predictor-5d779b8c86 from 0 to 1

kserve-ci-e2e-test

replicaset-controller

error-404-isvc-6d3ba-predictor-5d779b8c86

SuccessfulCreate

Created pod: error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-6d3ba

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6d3ba": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Killing

Stopping container kserve-container

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-6d3ba

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-6d3ba": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6d3ba": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

kubelet

error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt

Pulled

Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc

Created

Created container: kserve-container

kserve-ci-e2e-test

deployment-controller

success-200-isvc-6d3ba-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-6d3ba-predictor-5578786d5c from 0 to 1

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-6d3ba-predictor-5578786d5c

SuccessfulCreate

Created pod: success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc

kserve-ci-e2e-test

kubelet

success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-6d3ba

UpdateFailed

Failed to update status for InferenceService "error-404-isvc-6d3ba": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-6d3ba": the object has been modified; please apply your changes to the latest version and try again
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

success-200-isvc-9db4b-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

multus

error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt

AddedInterface

Add eth0 [10.134.0.27/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9

Unhealthy

Readiness probe failed: Get "https://10.134.0.23:8643/healthz": dial tcp 10.134.0.23:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-2

InferenceServiceReady

InferenceService [isvc-sklearn-graph-2] is Ready
(x9)

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-2

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

kubelet

error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r

Unhealthy

Readiness probe failed: Get "https://10.134.0.25:8643/healthz": dial tcp 10.134.0.25:8643: connect: connection refused

kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-1 | InferenceServiceReady | InferenceService [isvc-sklearn-graph-1] is Ready (x10)
kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-graph-1 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | isvc-xgboost-graph | InferenceServiceReady | InferenceService [isvc-xgboost-graph] is Ready (x9)
kserve-ci-e2e-test | v1beta1Controllers | isvc-xgboost-graph | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | deployment-controller | model-chainer | ScalingReplicaSet | Scaled up replica set model-chainer-85c5fc8d94 from 0 to 1 (x6)
kserve-ci-e2e-test | kubelet | switch-graph-9db4b-8484996d65-m4klk | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
kserve-ci-e2e-test | replicaset-controller | model-chainer-85c5fc8d94 | SuccessfulCreate | Created pod: model-chainer-85c5fc8d94-dvnrr (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | model-chainer-85c5fc8d94-dvnrr | Started | Started container model-chainer
kserve-ci-e2e-test | InferenceGraphController | model-chainer | InferenceGraphReady | InferenceGraph [model-chainer] is Ready
kserve-ci-e2e-test | kubelet | model-chainer-85c5fc8d94-dvnrr | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine
kserve-ci-e2e-test | multus | model-chainer-85c5fc8d94-dvnrr | AddedInterface | Add eth0 [10.132.0.25/23] from ovn-kubernetes (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6d3ba-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6d3ba-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x5)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-xgboost-graph-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | model-chainer-85c5fc8d94-dvnrr | Created | Created container: model-chainer (x5)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x5)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-1-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x5)
kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-graph-2-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test | deployment-controller | error-404-isvc-54155-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-54155-predictor-db578f77d from 0 to 1
kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-n7ckc | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-54155 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-54155": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | success-200-isvc-54155-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-54155-predictor-744c89d589 from 0 to 1
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-54155 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-54155": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-54155": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-n7ckc | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-54155-predictor-744c89d589 | SuccessfulCreate | Created pod: success-200-isvc-54155-predictor-744c89d589-bgsvq
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-54155-predictor-db578f77d | SuccessfulCreate | Created pod: error-404-isvc-54155-predictor-db578f77d-7fmtl
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f | Unhealthy | Readiness probe failed: Get "https://10.134.0.24:8643/healthz": dial tcp 10.134.0.24:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-54155-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | model-chainer-85c5fc8d94-dvnrr | Killing | Stopping container model-chainer
kserve-ci-e2e-test | multus | error-404-isvc-54155-predictor-db578f77d-7fmtl | AddedInterface | Add eth0 [10.134.0.29/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-54155-predictor-db578f77d-7fmtl | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-54155-predictor-db578f77d-7fmtl | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-54155-predictor-db578f77d-7fmtl | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-54155-predictor-db578f77d-7fmtl | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-54155-predictor-db578f77d-7fmtl | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-54155-predictor-744c89d589-bgsvq | AddedInterface | Add eth0 [10.134.0.28/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-54155-predictor-db578f77d-7fmtl | Started | Started container kube-rbac-proxy (x9)
kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw | Unhealthy | Readiness probe failed: dial tcp 10.133.0.27:8080: connect: connection refused

kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw | Unhealthy | Readiness probe failed: Get "https://10.133.0.27:8643/healthz": dial tcp 10.133.0.27:8643: connect: connection refused (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6d3ba-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6d3ba-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-predictor-669d8d6456-n7ckc | Unhealthy | Readiness probe failed: Get "https://10.132.0.23:8643/healthz": context deadline exceeded
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6d3ba | InferenceServiceReady | InferenceService [success-200-isvc-6d3ba] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6d3ba | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6d3ba | InferenceServiceReady | InferenceService [error-404-isvc-6d3ba] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6d3ba | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-54155-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-54155-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | switch-graph-6d3ba-548c7746dd-xgm2g | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine
kserve-ci-e2e-test | kubelet | switch-graph-6d3ba-548c7746dd-xgm2g | Started | Started container switch-graph-6d3ba
kserve-ci-e2e-test | InferenceGraphController | switch-graph-6d3ba | InferenceGraphReady | InferenceGraph [switch-graph-6d3ba] is Ready
kserve-ci-e2e-test | deployment-controller | switch-graph-6d3ba | ScalingReplicaSet | Scaled up replica set switch-graph-6d3ba-548c7746dd from 0 to 1
kserve-ci-e2e-test | replicaset-controller | switch-graph-6d3ba-548c7746dd | SuccessfulCreate | Created pod: switch-graph-6d3ba-548c7746dd-xgm2g
kserve-ci-e2e-test | kubelet | switch-graph-6d3ba-548c7746dd-xgm2g | Created | Created container: switch-graph-6d3ba
kserve-ci-e2e-test | multus | switch-graph-6d3ba-548c7746dd-xgm2g | AddedInterface | Add eth0 [10.132.0.26/23] from ovn-kubernetes
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6d3ba-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6d3ba-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x6)
kserve-ci-e2e-test | kubelet | model-chainer-85c5fc8d94-dvnrr | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6d3ba-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6d3ba-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-54155-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-54155-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-54155 | InferenceServiceReady | InferenceService [error-404-isvc-54155] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-54155 | InferenceServiceReady | InferenceService [success-200-isvc-54155] is Ready (x6)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-54155 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-54155 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-6d3ba | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-6d3ba | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | multus | sequence-graph-54155-84d8774cb-sx4nk | AddedInterface | Add eth0 [10.132.0.27/23] from ovn-kubernetes

kserve-ci-e2e-test | kubelet | sequence-graph-54155-84d8774cb-sx4nk | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-54155 | InferenceGraphReady | InferenceGraph [sequence-graph-54155] is Ready
kserve-ci-e2e-test | deployment-controller | sequence-graph-54155 | ScalingReplicaSet | Scaled up replica set sequence-graph-54155-84d8774cb from 0 to 1
kserve-ci-e2e-test | replicaset-controller | sequence-graph-54155-84d8774cb | SuccessfulCreate | Created pod: sequence-graph-54155-84d8774cb-sx4nk
kserve-ci-e2e-test | kubelet | sequence-graph-54155-84d8774cb-sx4nk | Started | Started container sequence-graph-54155
kserve-ci-e2e-test | kubelet | sequence-graph-54155-84d8774cb-sx4nk | Created | Created container: sequence-graph-54155
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-54155 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-54155-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-54155-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-54155 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-54155-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-54155-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-54155 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-54155 | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | multus | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | AddedInterface | Add eth0 [10.134.0.30/23] from ovn-kubernetes
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-dd3fc | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-dd3fc": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-dd3fc": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-dd3fc | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-dd3fc": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt | Killing | Stopping container kserve-container
kserve-ci-e2e-test | multus | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | AddedInterface | Add eth0 [10.134.0.31/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | switch-graph-6d3ba-548c7746dd-xgm2g | Killing | Stopping container switch-graph-6d3ba
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-dd3fc | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-dd3fc": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-dd3fc": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc | Killing | Stopping container kserve-container

kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-dd3fc | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-dd3fc": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-dd3fc-predictor-7dd9d668df | SuccessfulCreate | Created pod: success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc
kserve-ci-e2e-test | deployment-controller | success-200-isvc-dd3fc-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-dd3fc-predictor-7dd9d668df from 0 to 1
kserve-ci-e2e-test | deployment-controller | error-404-isvc-dd3fc-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-dd3fc-predictor-5bbf46867d from 0 to 1
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-dd3fc-predictor-5bbf46867d | SuccessfulCreate | Created pod: error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc | Unhealthy | Readiness probe failed: Get "https://10.134.0.26:8643/healthz": dial tcp 10.134.0.26:8643: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt | Unhealthy | Readiness probe failed: dial tcp 10.134.0.27:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt | Unhealthy | Readiness probe failed: Get "https://10.134.0.27:8643/healthz": dial tcp 10.134.0.27:8643: connect: connection refused (x8)
kserve-ci-e2e-test | kubelet | success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc | Unhealthy | Readiness probe failed: dial tcp 10.134.0.26:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | switch-graph-6d3ba-548c7746dd-xgm2g | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-dd3fc-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-dd3fc-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-dd3fc-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-dd3fc-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Unhealthy | Readiness probe failed: dial tcp 10.134.0.30:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Unhealthy | Readiness probe failed: dial tcp 10.134.0.31:8080: connect: connection refused
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6445b | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-6445b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-6445b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-6445b-predictor-566b89cd56 | SuccessfulCreate | Created pod: error-404-isvc-6445b-predictor-566b89cd56-67vrw
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-54155-predictor-744c89d589-bgsvq | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | deployment-controller | error-404-isvc-6445b-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-6445b-predictor-566b89cd56 from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-6445b-predictor-769c496d67-hc2tf | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-6445b-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | sequence-graph-54155-84d8774cb-sx4nk | Killing | Stopping container sequence-graph-54155
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6445b | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6445b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6445b | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-6445b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | multus | error-404-isvc-6445b-predictor-566b89cd56-67vrw | AddedInterface | Add eth0 [10.134.0.33/23] from ovn-kubernetes

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-6445b

UpdateFailed

Failed to update status for InferenceService "success-200-isvc-6445b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6445b": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

replicaset-controller

success-200-isvc-6445b-predictor-769c496d67

SuccessfulCreate

Created pod: success-200-isvc-6445b-predictor-769c496d67-hc2tf

kserve-ci-e2e-test

kubelet

error-404-isvc-54155-predictor-db578f77d-7fmtl

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

error-404-isvc-54155-predictor-db578f77d-7fmtl

Killing

Stopping container kserve-container

kserve-ci-e2e-test

deployment-controller

success-200-isvc-6445b-predictor

ScalingReplicaSet

Scaled up replica set success-200-isvc-6445b-predictor-769c496d67 from 0 to 1

kserve-ci-e2e-test

kubelet

success-200-isvc-6445b-predictor-769c496d67-hc2tf

Pulled

Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine

kserve-ci-e2e-test

multus

success-200-isvc-6445b-predictor-769c496d67-hc2tf

AddedInterface

Add eth0 [10.134.0.32/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

success-200-isvc-6445b-predictor-769c496d67-hc2tf

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-6445b-predictor-769c496d67-hc2tf

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

success-200-isvc-6445b-predictor-769c496d67-hc2tf

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

success-200-isvc-6445b-predictor-769c496d67-hc2tf

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-6445b-predictor-769c496d67-hc2tf

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

success-200-isvc-54155-predictor-744c89d589-bgsvq

Unhealthy

Readiness probe failed: Get "https://10.134.0.28:8643/healthz": dial tcp 10.134.0.28:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

error-404-isvc-54155-predictor-db578f77d-7fmtl

Unhealthy

Readiness probe failed: Get "https://10.134.0.29:8643/healthz": dial tcp 10.134.0.29:8643: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

success-200-isvc-54155-predictor-744c89d589-bgsvq

Unhealthy

Readiness probe failed: dial tcp 10.134.0.28:8080: connect: connection refused
(x7)

kserve-ci-e2e-test

kubelet

error-404-isvc-54155-predictor-db578f77d-7fmtl

Unhealthy

Readiness probe failed: dial tcp 10.134.0.29:8080: connect: connection refused

kserve-ci-e2e-test

v1beta1Controllers

error-404-isvc-dd3fc

InferenceServiceReady

InferenceService [error-404-isvc-dd3fc] is Ready

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-dd3fc

InferenceServiceReady

InferenceService [success-200-isvc-dd3fc] is Ready
(x5)

kserve-ci-e2e-test

v1beta1Controllers

success-200-isvc-dd3fc

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-dd3fc | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-dd3fc-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-dd3fc | InferenceGraphReady | InferenceGraph [ensemble-graph-dd3fc] is Ready (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-dd3fc-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-dd3fc-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "ensemble-graph-dd3fc-serving-cert" not found
kserve-ci-e2e-test | deployment-controller | ensemble-graph-dd3fc | ScalingReplicaSet | Scaled up replica set ensemble-graph-dd3fc-6bb9dd7f4b from 0 to 1
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-dd3fc-6bb9dd7f4b | SuccessfulCreate | Created pod: ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-dd3fc-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc | Created | Created container: ensemble-graph-dd3fc
kserve-ci-e2e-test | kubelet | ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine
kserve-ci-e2e-test | kubelet | ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc | Started | Started container ensemble-graph-dd3fc
kserve-ci-e2e-test | multus | ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc | AddedInterface | Add eth0 [10.132.0.28/23] from ovn-kubernetes (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-54155-84d8774cb-sx4nk | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test | replicaset-controller | success-200-isvc-a9fd5-predictor-864584f7cf | SuccessfulCreate | Created pod: success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-a9fd5 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-a9fd5": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-a9fd5": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | deployment-controller | success-200-isvc-a9fd5-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-a9fd5-predictor-864584f7cf from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-a9fd5-predictor-serving-cert" not found
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-a9fd5 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-a9fd5": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc | Killing | Stopping container ensemble-graph-dd3fc
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Killing | Stopping container kserve-container
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-a9fd5-predictor-869bc478f6 | SuccessfulCreate | Created pod: error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5
kserve-ci-e2e-test | deployment-controller | error-404-isvc-a9fd5-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-a9fd5-predictor-869bc478f6 from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-a9fd5 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-a9fd5": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-a9fd5": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-a9fd5 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-a9fd5": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Created | Created container: kserve-container

kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | AddedInterface | Add eth0 [10.134.0.34/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | multus | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | AddedInterface | Add eth0 [10.134.0.35/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt | Unhealthy | Readiness probe failed: Get "https://10.134.0.31:8643/healthz": dial tcp 10.134.0.31:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc | Unhealthy | Readiness probe failed: Get "https://10.134.0.30:8643/healthz": dial tcp 10.134.0.30:8643: connect: connection refused (x6)

kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Unhealthy | Readiness probe failed: dial tcp 10.134.0.33:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-6445b-predictor-769c496d67-hc2tf | Unhealthy | Readiness probe failed: dial tcp 10.134.0.32:8080: connect: connection refused (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6445b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-a9fd5-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6445b-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-a9fd5-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6445b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6445b-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6445b | InferenceServiceReady | InferenceService [error-404-isvc-6445b] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6445b | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6445b | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6445b | InferenceServiceReady | InferenceService [success-200-isvc-6445b] is Ready (x6)
kserve-ci-e2e-test | kubelet | ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6445b-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)

kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6445b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | replicaset-controller | sequence-graph-6445b-696c86c896 | SuccessfulCreate | Created pod: sequence-graph-6445b-696c86c896-8k2tz
kserve-ci-e2e-test | kubelet | sequence-graph-6445b-696c86c896-8k2tz | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "sequence-graph-6445b-serving-cert" not found
kserve-ci-e2e-test | deployment-controller | sequence-graph-6445b | ScalingReplicaSet | Scaled up replica set sequence-graph-6445b-696c86c896 from 0 to 1 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6445b-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-6445b | InferenceGraphReady | InferenceGraph [sequence-graph-6445b] is Ready (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6445b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | sequence-graph-6445b-696c86c896-8k2tz | Created | Created container: sequence-graph-6445b
kserve-ci-e2e-test | kubelet | sequence-graph-6445b-696c86c896-8k2tz | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine
kserve-ci-e2e-test | multus | sequence-graph-6445b-696c86c896-8k2tz | AddedInterface | Add eth0 [10.132.0.29/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | sequence-graph-6445b-696c86c896-8k2tz | Started | Started container sequence-graph-6445b (x7)
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Unhealthy | Readiness probe failed: dial tcp 10.134.0.34:8080: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Unhealthy | Readiness probe failed: dial tcp 10.134.0.35:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | success-200-isvc-6445b-predictor-769c496d67-hc2tf | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-d1134 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-d1134": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-d1134": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test | deployment-controller | error-404-isvc-d1134-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-d1134-predictor-796fc776b4 from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | sequence-graph-6445b-696c86c896-8k2tz | Killing | Stopping container sequence-graph-6445b
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-d1134-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-d1134 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-d1134": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-d1134 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-d1134": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-d1134": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Created | Created container: kserve-container
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-d1134-predictor-b6cdf5487 | SuccessfulCreate | Created pod: success-200-isvc-d1134-predictor-b6cdf5487-xvsr9
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | AddedInterface | Add eth0 [10.134.0.36/23] from ovn-kubernetes
kserve-ci-e2e-test | deployment-controller | success-200-isvc-d1134-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-d1134-predictor-b6cdf5487 from 0 to 1

kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-d1134 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-d1134": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-d1134-predictor-796fc776b4 | SuccessfulCreate | Created pod: error-404-isvc-d1134-predictor-796fc776b4-5ssqh
kserve-ci-e2e-test | kubelet | success-200-isvc-6445b-predictor-769c496d67-hc2tf | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-6445b-predictor-566b89cd56-67vrw | Unhealthy | Readiness probe failed: Get "https://10.134.0.33:8643/healthz": dial tcp 10.134.0.33:8643: connect: connection refused
kserve-ci-e2e-test | multus | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | AddedInterface | Add eth0 [10.134.0.37/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-6445b-predictor-769c496d67-hc2tf | Unhealthy | Readiness probe failed: Get "https://10.134.0.32:8643/healthz": dial tcp 10.134.0.32:8643: connect: connection refused (x2)

kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-a9fd5-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-a9fd5-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-a9fd5 | InferenceServiceReady | InferenceService [error-404-isvc-a9fd5] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-a9fd5 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-a9fd5 | InferenceServiceReady | InferenceService [success-200-isvc-a9fd5] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-a9fd5 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | replicaset-controller | ensemble-graph-a9fd5-7b8774d5b4 | SuccessfulCreate | Created pod: ensemble-graph-a9fd5-7b8774d5b4-6s425
kserve-ci-e2e-test | InferenceGraphController | ensemble-graph-a9fd5 | InferenceGraphReady | InferenceGraph [ensemble-graph-a9fd5] is Ready
kserve-ci-e2e-test | deployment-controller | ensemble-graph-a9fd5 | ScalingReplicaSet | Scaled up replica set ensemble-graph-a9fd5-7b8774d5b4 from 0 to 1 (x2)
kserve-ci-e2e-test | kubelet | ensemble-graph-a9fd5-7b8774d5b4-6s425 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "ensemble-graph-a9fd5-serving-cert" not found
kserve-ci-e2e-test | kubelet | ensemble-graph-a9fd5-7b8774d5b4-6s425 | Started | Started container ensemble-graph-a9fd5
kserve-ci-e2e-test | kubelet | ensemble-graph-a9fd5-7b8774d5b4-6s425 | Created | Created container: ensemble-graph-a9fd5
kserve-ci-e2e-test | kubelet | ensemble-graph-a9fd5-7b8774d5b4-6s425 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine

kserve-ci-e2e-test | multus | ensemble-graph-a9fd5-7b8774d5b4-6s425 | AddedInterface | Add eth0 [10.132.0.30/23] from ovn-kubernetes (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-6445b-696c86c896-8k2tz | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-d1134-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-d1134-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-a9fd5-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-a9fd5-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-d1134-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-d1134-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-d1134 | InferenceServiceReady | InferenceService [error-404-isvc-d1134] is Ready
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-d1134 | InferenceServiceReady | InferenceService [success-200-isvc-d1134] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-d1134 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-d1134 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-a9fd5-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-a9fd5-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test | kubelet | sequence-graph-d1134-69b44c49f9-tm66r | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "sequence-graph-d1134-serving-cert" not found
kserve-ci-e2e-test | replicaset-controller | sequence-graph-d1134-69b44c49f9 | SuccessfulCreate | Created pod: sequence-graph-d1134-69b44c49f9-tm66r
kserve-ci-e2e-test | deployment-controller | sequence-graph-d1134 | ScalingReplicaSet | Scaled up replica set sequence-graph-d1134-69b44c49f9 from 0 to 1
kserve-ci-e2e-test | InferenceGraphController | sequence-graph-d1134 | InferenceGraphReady | InferenceGraph [sequence-graph-d1134] is Ready
kserve-ci-e2e-test | multus | sequence-graph-d1134-69b44c49f9-tm66r | AddedInterface | Add eth0 [10.132.0.31/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | sequence-graph-d1134-69b44c49f9-tm66r | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine
kserve-ci-e2e-test | kubelet | sequence-graph-d1134-69b44c49f9-tm66r | Created | Created container: sequence-graph-d1134
kserve-ci-e2e-test | kubelet | sequence-graph-d1134-69b44c49f9-tm66r | Started | Started container sequence-graph-d1134 (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-a9fd5 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | ensemble-graph-a9fd5 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-d1134-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-d1134-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-d1134-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-d1134-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-d1134 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | sequence-graph-d1134 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6842b | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-6842b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6842b | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-6842b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6842b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6842b | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-6842b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-6842b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | ensemble-graph-a9fd5-7b8774d5b4-6s425 | Killing | Stopping container ensemble-graph-a9fd5
kserve-ci-e2e-test | kubelet | error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-6842b-predictor-7bd4c8d8df | SuccessfulCreate | Created pod: success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc
kserve-ci-e2e-test | deployment-controller | success-200-isvc-6842b-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-6842b-predictor-7bd4c8d8df from 0 to 1
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6842b | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6842b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-6842b-predictor-565dd576f8 | SuccessfulCreate | Created pod: error-404-isvc-6842b-predictor-565dd576f8-rwz65
kserve-ci-e2e-test | deployment-controller | error-404-isvc-6842b-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-6842b-predictor-565dd576f8 from 0 to 1
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Started | Started container kube-rbac-proxy

kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | AddedInterface | Add eth0 [10.134.0.38/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | multus | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | AddedInterface | Add eth0 [10.134.0.39/23] from ovn-kubernetes
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6842b-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6842b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x6)
kserve-ci-e2e-test | kubelet | ensemble-graph-a9fd5-7b8774d5b4-6s425 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x6)

kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Unhealthy | Readiness probe failed: dial tcp 10.134.0.38:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Unhealthy | Readiness probe failed: dial tcp 10.134.0.39:8080: connect: connection refused
kserve-ci-e2e-test | deployment-controller | success-200-isvc-fc125-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-fc125-predictor-67497cd975 from 0 to 1
kserve-ci-e2e-test | multus | success-200-isvc-fc125-predictor-67497cd975-mrbkj | AddedInterface | Add eth0 [10.134.0.40/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | multus | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | AddedInterface | Add eth0 [10.134.0.41/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | sequence-graph-d1134-69b44c49f9-tm66r | Killing | Stopping container sequence-graph-d1134

kserve-ci-e2e-test

kubelet

success-200-isvc-fc125-predictor-67497cd975-mrbkj

Started

Started container kserve-container

kserve-ci-e2e-test | replicaset-controller | success-200-isvc-fc125-predictor-67497cd975 | SuccessfulCreate | Created pod: success-200-isvc-fc125-predictor-67497cd975-mrbkj
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Created | Created container: kserve-container
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-fc125 | UpdateFailed | Failed to update status for InferenceService "error-404-isvc-fc125": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-fc125": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Killing | Stopping container kserve-container
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-fc125 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "error-404-isvc-fc125": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-fc125 | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-fc125": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-fc125": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-fc125 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-fc125": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Started | Started container kserve-container
kserve-ci-e2e-test | deployment-controller | error-404-isvc-fc125-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-fc125-predictor-7967db9f76 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-fc125-predictor-7967db9f76 | SuccessfulCreate | Created pod: error-404-isvc-fc125-predictor-7967db9f76-wrnsk
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine (x7)
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Unhealthy | Readiness probe failed: dial tcp 10.134.0.37:8080: connect: connection refused (x7)
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Unhealthy | Readiness probe failed: dial tcp 10.134.0.36:8080: connect: connection refused
kserve-ci-e2e-test | kubelet | error-404-isvc-d1134-predictor-796fc776b4-5ssqh | Unhealthy | Readiness probe failed: Get "https://10.134.0.37:8643/healthz": dial tcp 10.134.0.37:8643: connect: connection refused
kserve-ci-e2e-test | kubelet | success-200-isvc-d1134-predictor-b6cdf5487-xvsr9 | Unhealthy | Readiness probe failed: Get "https://10.134.0.36:8643/healthz": dial tcp 10.134.0.36:8643: connect: connection refused (x3)

kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6842b-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6842b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6842b | InferenceServiceReady | InferenceService [success-200-isvc-6842b] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6842b | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6842b | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6842b | InferenceServiceReady | InferenceService [error-404-isvc-6842b] is Ready
kserve-ci-e2e-test | replicaset-controller | splitter-graph-6842b-55bb6cfccf | SuccessfulCreate | Created pod: splitter-graph-6842b-55bb6cfccf-g8lvd
kserve-ci-e2e-test | InferenceGraphController | splitter-graph-6842b | InferenceGraphReady | InferenceGraph [splitter-graph-6842b] is Ready
kserve-ci-e2e-test | deployment-controller | splitter-graph-6842b | ScalingReplicaSet | Scaled up replica set splitter-graph-6842b-55bb6cfccf from 0 to 1

kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6842b-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6842b-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6842b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | splitter-graph-6842b-55bb6cfccf-g8lvd | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "splitter-graph-6842b-serving-cert" not found (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6842b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | splitter-graph-6842b-55bb6cfccf-g8lvd | Started | Started container splitter-graph-6842b
kserve-ci-e2e-test | kubelet | splitter-graph-6842b-55bb6cfccf-g8lvd | Created | Created container: splitter-graph-6842b
kserve-ci-e2e-test | kubelet | splitter-graph-6842b-55bb6cfccf-g8lvd | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine
kserve-ci-e2e-test | multus | splitter-graph-6842b-55bb6cfccf-g8lvd | AddedInterface | Add eth0 [10.132.0.32/23] from ovn-kubernetes (x6)
kserve-ci-e2e-test | kubelet | sequence-graph-d1134-69b44c49f9-tm66r | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503

kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Killing | Stopping container kserve-container
kserve-ci-e2e-test | replicaset-controller | success-200-isvc-6941b-predictor-58bfbfdd8f | SuccessfulCreate | Created pod: success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf
kserve-ci-e2e-test | kubelet | splitter-graph-6842b-55bb6cfccf-g8lvd | Killing | Stopping container splitter-graph-6842b
kserve-ci-e2e-test | deployment-controller | error-404-isvc-6941b-predictor | ScalingReplicaSet | Scaled up replica set error-404-isvc-6941b-predictor-db66cc574 from 0 to 1 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-fc125-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-fc125-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6941b | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6941b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6941b | UpdateFailed | Failed to update status for InferenceService "success-200-isvc-6941b": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "success-200-isvc-6941b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Killing | Stopping container kserve-container
kserve-ci-e2e-test | replicaset-controller | error-404-isvc-6941b-predictor-db66cc574 | SuccessfulCreate | Created pod: error-404-isvc-6941b-predictor-db66cc574-zc9wx
kserve-ci-e2e-test | deployment-controller | success-200-isvc-6941b-predictor | ScalingReplicaSet | Scaled up replica set success-200-isvc-6941b-predictor-58bfbfdd8f from 0 to 1
kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Killing | Stopping container kube-rbac-proxy

kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "success-200-isvc-6941b-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Pulled | Container image "quay.io/opendatahub/success-200-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | multus | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | AddedInterface | Add eth0 [10.134.0.42/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "error-404-isvc-6941b-predictor-serving-cert" not found
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc | Unhealthy | Readiness probe failed: Get "https://10.134.0.38:8643/healthz": dial tcp 10.134.0.38:8643: connect: connection refused

kserve-ci-e2e-test | multus | error-404-isvc-6941b-predictor-db66cc574-zc9wx | AddedInterface | Add eth0 [10.134.0.43/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Created | Created container: kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Pulled | Container image "quay.io/opendatahub/error-404-isvc:odh-pr-1450" already present on machine
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Created | Created container: kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Started | Started container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Started | Started container kserve-container
kserve-ci-e2e-test | kubelet | error-404-isvc-6842b-predictor-565dd576f8-rwz65 | Unhealthy | Readiness probe failed: Get "https://10.134.0.39:8643/healthz": dial tcp 10.134.0.39:8643: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Unhealthy | Readiness probe failed: dial tcp 10.134.0.40:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Unhealthy | Readiness probe failed: dial tcp 10.134.0.41:8080: connect: connection refused (x5)

kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-fc125 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-fc125 | InferenceServiceReady | InferenceService [success-200-isvc-fc125] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-fc125 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-fc125 | InferenceServiceReady | InferenceService [error-404-isvc-fc125] is Ready (x6)
kserve-ci-e2e-test | kubelet | splitter-graph-6842b-55bb6cfccf-g8lvd | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6941b-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6941b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-fc125-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | multus | switch-graph-fc125-5c9566c648-xg996 | AddedInterface | Add eth0 [10.132.0.33/23] from ovn-kubernetes
kserve-ci-e2e-test | kubelet | switch-graph-fc125-5c9566c648-xg996 | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine (x2)

kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6941b-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x2)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6941b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x4)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-fc125-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | kubelet | switch-graph-fc125-5c9566c648-xg996 | Created | Created container: switch-graph-fc125
kserve-ci-e2e-test | kubelet | switch-graph-fc125-5c9566c648-xg996 | Started | Started container switch-graph-fc125
kserve-ci-e2e-test | InferenceGraphController | switch-graph-fc125 | InferenceGraphReady | InferenceGraph [switch-graph-fc125] is Ready
kserve-ci-e2e-test | deployment-controller | switch-graph-fc125 | ScalingReplicaSet | Scaled up replica set switch-graph-fc125-5c9566c648 from 0 to 1
kserve-ci-e2e-test | replicaset-controller | switch-graph-fc125-5c9566c648 | SuccessfulCreate | Created pod: switch-graph-fc125-5c9566c648-xg996 (x6)
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Unhealthy | Readiness probe failed: dial tcp 10.134.0.42:8080: connect: connection refused (x6)
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Unhealthy | Readiness probe failed: dial tcp 10.134.0.43:8080: connect: connection refused (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-fc125-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)

kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-fc125-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-fc125-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-fc125-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-fc125 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-fc125 | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6941b | InferenceServiceReady | InferenceService [success-200-isvc-6941b] is Ready (x5)
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6941b | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | v1beta1Controllers | error-404-isvc-6941b | InferenceServiceReady | InferenceService [error-404-isvc-6941b] is Ready (x6)
kserve-ci-e2e-test | v1beta1Controllers | success-200-isvc-6941b | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
kserve-ci-e2e-test | deployment-controller | splitter-graph-6941b | ScalingReplicaSet | Scaled up replica set splitter-graph-6941b-78cf747fb6 from 0 to 1

kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-fc125 | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | switch-graph-fc125 | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | replicaset-controller | splitter-graph-6941b-78cf747fb6 | SuccessfulCreate | Created pod: splitter-graph-6941b-78cf747fb6-bxvdp
kserve-ci-e2e-test | kubelet | splitter-graph-6941b-78cf747fb6-bxvdp | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "splitter-graph-6941b-serving-cert" not found
kserve-ci-e2e-test | multus | splitter-graph-6941b-78cf747fb6-bxvdp | AddedInterface | Add eth0 [10.132.0.34/23] from ovn-kubernetes
kserve-ci-e2e-test | InferenceGraphController | splitter-graph-6941b | InternalError | fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "splitter-graph-6941b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | InferenceGraphController | splitter-graph-6941b | UpdateFailed | Failed to update status for InferenceGraph "splitter-graph-6941b": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "splitter-graph-6941b": the object has been modified; please apply your changes to the latest version and try again
kserve-ci-e2e-test | InferenceGraphController | splitter-graph-6941b | InferenceGraphReady | InferenceGraph [splitter-graph-6941b] is Ready
kserve-ci-e2e-test | kubelet | splitter-graph-6941b-78cf747fb6-bxvdp | Pulled | Container image "quay.io/opendatahub/kserve-router@sha256:bfb5b2782772ad46be0a5c41e64aded1dd42f09d6e2e6f1f1c3194223df058cb" already present on machine
kserve-ci-e2e-test | kubelet | splitter-graph-6941b-78cf747fb6-bxvdp | Started | Started container splitter-graph-6941b
kserve-ci-e2e-test | kubelet | splitter-graph-6941b-78cf747fb6-bxvdp | Created | Created container: splitter-graph-6941b
kserve-ci-e2e-test | horizontal-pod-autoscaler | splitter-graph-6941b | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)

kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6941b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | splitter-graph-6941b | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | success-200-isvc-6941b-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6941b-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) (x3)
kserve-ci-e2e-test | horizontal-pod-autoscaler | error-404-isvc-6941b-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | splitter-graph-6941b | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | horizontal-pod-autoscaler | splitter-graph-6941b | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | splitter-graph-6941b-78cf747fb6-bxvdp | Killing | Stopping container splitter-graph-6941b
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-6941b-predictor-db66cc574-zc9wx | Killing | Stopping container kube-rbac-proxy (x6)

kserve-ci-e2e-test | kubelet | splitter-graph-6941b-78cf747fb6-bxvdp | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | error-404-isvc-fc125-predictor-7967db9f76-wrnsk | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | switch-graph-fc125-5c9566c648-xg996 | Killing | Stopping container switch-graph-fc125
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Killing | Stopping container kube-rbac-proxy
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Killing | Stopping container kserve-container
kserve-ci-e2e-test | kubelet | success-200-isvc-fc125-predictor-67497cd975-mrbkj | Unhealthy | Readiness probe failed: Get "https://10.134.0.40:8643/healthz": dial tcp 10.134.0.40:8643: connect: connection refused (x5)
kserve-ci-e2e-test | kubelet | switch-graph-fc125-5c9566c648-xg996 | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503