Time Namespace Component RelatedObject Reason Message

kserve-ci-e2e-test

isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

model-chainer-raw-357ae-5d6b99669d-w894m

Scheduled

Successfully assigned kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m to ip-10-0-132-160.ec2.internal

kserve-ci-e2e-test

raw-sklearn-3e269-predictor-75dc5c954c-zt7nt

Scheduled

Successfully assigned kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

model-chainer-raw-hpa-5628a-687c9b99db-wsj4p

Scheduled

Successfully assigned kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p to ip-10-0-132-160.ec2.internal

kserve-ci-e2e-test

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

isvc-init-fail-687035-predictor-857cbdf89c-qnws7

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7 to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp

Scheduled

Successfully assigned kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq

Scheduled

Successfully assigned kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf

Scheduled

Successfully assigned kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf to ip-10-0-134-36.ec2.internal

kserve-ci-e2e-test

v1beta1Controllers

isvc-raw-sklearn-batcher-199b4

UpdateFailed

Failed to update status for InferenceService "isvc-raw-sklearn-batcher-199b4": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-raw-sklearn-batcher-199b4": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

deployment-controller

isvc-raw-sklearn-batcher-199b4-predictor

ScalingReplicaSet

Scaled up replica set isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f from 0 to 1

kserve-ci-e2e-test

v1beta1Controllers

isvc-raw-sklearn-batcher-199b4

InternalError

fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-raw-sklearn-batcher-199b4": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

replicaset-controller

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f

SuccessfulCreate

Created pod: isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

kserve-ci-e2e-test

multus

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

AddedInterface

Add eth0 [10.134.0.17/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Pulling

Pulling image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f"

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" in 3.442s (3.442s including waiting). Image size: 301485528 bytes.

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Started

Started container storage-initializer
(x23)

kserve-ci-e2e-test

v1beta1Controllers

isvc-raw-sklearn-batcher-199b4

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Pulling

Pulling image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404"

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Pulled

Successfully pulled image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404" in 13.194s (13.194s including waiting). Image size: 1560922562 bytes.

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Pulling

Pulling image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3"

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Pulled

Successfully pulled image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" in 2.307s (2.307s including waiting). Image size: 211946088 bytes.

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Pulling

Pulling image "quay.io/opendatahub/kserve-agent@sha256:de59d4f440abaeb1e71b5977a2145cdbe8db88ded8ac16ca09f179d82ba41738"

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Started

Started container agent

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-agent@sha256:de59d4f440abaeb1e71b5977a2145cdbe8db88ded8ac16ca09f179d82ba41738" in 2.357s (2.357s including waiting). Image size: 238051450 bytes.

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Created

Created container: agent
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-raw-sklearn-batcher-199b4-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-raw-sklearn-batcher-199b4-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-raw-sklearn-batcher-199b4-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-raw-sklearn-batcher-199b4-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

v1beta1Controllers

isvc-raw-sklearn-batcher-199b4

InferenceServiceReady

InferenceService [isvc-raw-sklearn-batcher-199b4] is Ready

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Killing

Stopping container agent

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Pulled

Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine

kserve-ci-e2e-test

replicaset-controller

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5

SuccessfulCreate

Created pod: isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

kserve-ci-e2e-test

deployment-controller

isvc-sklearn-graph-raw-357ae-predictor

ScalingReplicaSet

Scaled up replica set isvc-sklearn-graph-raw-357ae-predictor-56864bdff5 from 0 to 1

kserve-ci-e2e-test

deployment-controller

isvc-xgboost-graph-raw-357ae-predictor

ScalingReplicaSet

Scaled up replica set isvc-xgboost-graph-raw-357ae-predictor-f454f465 from 0 to 1

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Created

Created container: storage-initializer

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-raw-357ae

UpdateFailed

Failed to update status for InferenceService "isvc-sklearn-graph-raw-357ae": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-sklearn-graph-raw-357ae": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

multus

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

AddedInterface

Add eth0 [10.134.0.18/23] from ovn-kubernetes

kserve-ci-e2e-test

multus

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

AddedInterface

Add eth0 [10.134.0.19/23] from ovn-kubernetes

kserve-ci-e2e-test

replicaset-controller

isvc-xgboost-graph-raw-357ae-predictor-f454f465

SuccessfulCreate

Created pod: isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Pulled

Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Pulled

Container image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Pulling

Pulling image "kserve/xgbserver:latest"

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Started

Started container kserve-container
(x24)

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-raw-357ae

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x10)

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x10)

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Unhealthy

Readiness probe failed: dial tcp 10.134.0.17:8080: connect: connection refused
(x25)

kserve-ci-e2e-test

v1beta1Controllers

isvc-xgboost-graph-raw-357ae

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x5)

kserve-ci-e2e-test

kubelet

isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn

Unhealthy

Readiness probe failed: Get "https://10.134.0.17:8643/healthz": dial tcp 10.134.0.17:8643: connect: connection refused

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Pulled

Successfully pulled image "kserve/xgbserver:latest" in 19.235s (19.235s including waiting). Image size: 1306329851 bytes.

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Created

Created container: kserve-container
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-raw-357ae-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-raw-357ae-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-raw-357ae-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x3)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-raw-357ae-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-raw-357ae

InferenceServiceReady

InferenceService [isvc-sklearn-graph-raw-357ae] is Ready

kserve-ci-e2e-test

v1beta1Controllers

isvc-xgboost-graph-raw-357ae

InferenceServiceReady

InferenceService [isvc-xgboost-graph-raw-357ae] is Ready

kserve-ci-e2e-test

kubelet

model-chainer-raw-357ae-5d6b99669d-w894m

FailedMount

MountVolume.SetUp failed for volume "proxy-tls" : secret "model-chainer-raw-357ae-serving-cert" not found

kserve-ci-e2e-test

replicaset-controller

model-chainer-raw-357ae-5d6b99669d

SuccessfulCreate

Created pod: model-chainer-raw-357ae-5d6b99669d-w894m

kserve-ci-e2e-test

deployment-controller

model-chainer-raw-357ae

ScalingReplicaSet

Scaled up replica set model-chainer-raw-357ae-5d6b99669d from 0 to 1

kserve-ci-e2e-test

InferenceGraphController

model-chainer-raw-357ae

InferenceGraphReady

InferenceGraph [model-chainer-raw-357ae] is Ready

kserve-ci-e2e-test

kubelet

model-chainer-raw-357ae-5d6b99669d-w894m

Pulling

Pulling image "quay.io/opendatahub/kserve-router@sha256:6a7f2e0065ff588673ad687410eb1f2b5ccfca9b42b71ed586d2fac5a64a6ca4"

kserve-ci-e2e-test

multus

model-chainer-raw-357ae-5d6b99669d-w894m

AddedInterface

Add eth0 [10.133.0.31/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

model-chainer-raw-357ae-5d6b99669d-w894m

Pulled

Successfully pulled image "quay.io/opendatahub/kserve-router@sha256:6a7f2e0065ff588673ad687410eb1f2b5ccfca9b42b71ed586d2fac5a64a6ca4" in 2.528s (2.528s including waiting). Image size: 216346748 bytes.

kserve-ci-e2e-test

kubelet

model-chainer-raw-357ae-5d6b99669d-w894m

Created

Created container: model-chainer-raw-357ae

kserve-ci-e2e-test

kubelet

model-chainer-raw-357ae-5d6b99669d-w894m

Started

Started container model-chainer-raw-357ae
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-raw-357ae-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-raw-357ae-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-raw-357ae-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x5)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-raw-357ae-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

model-chainer-raw-357ae-5d6b99669d-w894m

Killing

Stopping container model-chainer-raw-357ae

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff

SuccessfulCreate

Created pod: isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

kserve-ci-e2e-test

deployment-controller

isvc-sklearn-graph-raw-hpa-5628a-predictor

ScalingReplicaSet

Scaled up replica set isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff from 0 to 1

kserve-ci-e2e-test

multus

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

AddedInterface

Add eth0 [10.134.0.21/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Created

Created container: storage-initializer

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Pulled

Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Pulled

Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine

kserve-ci-e2e-test

deployment-controller

isvc-xgboost-graph-raw-hpa-5628a-predictor

ScalingReplicaSet

Scaled up replica set isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69 from 0 to 1

kserve-ci-e2e-test

multus

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

AddedInterface

Add eth0 [10.134.0.20/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Killing

Stopping container kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Started

Started container storage-initializer

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Created

Created container: storage-initializer

kserve-ci-e2e-test

replicaset-controller

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69

SuccessfulCreate

Created pod: isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Unhealthy

Readiness probe failed: Get "https://10.134.0.19:8643/healthz": dial tcp 10.134.0.19:8643: connect: connection refused
(x10)

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Unhealthy

Readiness probe failed: dial tcp 10.134.0.18:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf

Unhealthy

Readiness probe failed: Get "https://10.134.0.18:8643/healthz": dial tcp 10.134.0.18:8643: connect: connection refused
(x9)

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj

Unhealthy

Readiness probe failed: dial tcp 10.134.0.19:8080: connect: connection refused

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Pulled

Container image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Created

Created container: kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Started

Started container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Pulled

Container image "kserve/xgbserver:latest" already present on machine

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Created

Created container: kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Started

Started container kserve-container

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Pulled

Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine
(x25)

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-raw-hpa-5628a

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x25)

kserve-ci-e2e-test

v1beta1Controllers

isvc-xgboost-graph-raw-hpa-5628a

VirtualServiceCRDNotFound

Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true.
(x6)

kserve-ci-e2e-test

kubelet

model-chainer-raw-357ae-5d6b99669d-w894m

Unhealthy

Readiness probe failed: HTTP probe failed with statuscode: 503
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-raw-hpa-5628a-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-raw-hpa-5628a-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-raw-hpa-5628a-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x2)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-raw-hpa-5628a-predictor

FailedGetResourceMetric

failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API
(x9)

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Unhealthy

Readiness probe failed: dial tcp 10.134.0.20:8080: connect: connection refused

kserve-ci-e2e-test

v1beta1Controllers

isvc-xgboost-graph-raw-hpa-5628a

InferenceServiceReady

InferenceService [isvc-xgboost-graph-raw-hpa-5628a] is Ready

kserve-ci-e2e-test

v1beta1Controllers

isvc-sklearn-graph-raw-hpa-5628a

InferenceServiceReady

InferenceService [isvc-sklearn-graph-raw-hpa-5628a] is Ready
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-raw-hpa-5628a-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

InferenceGraphController

model-chainer-raw-hpa-5628a

InferenceGraphReady

InferenceGraph [model-chainer-raw-hpa-5628a] is Ready

kserve-ci-e2e-test

deployment-controller

model-chainer-raw-hpa-5628a

ScalingReplicaSet

Scaled up replica set model-chainer-raw-hpa-5628a-687c9b99db from 0 to 1

kserve-ci-e2e-test

replicaset-controller

model-chainer-raw-hpa-5628a-687c9b99db

SuccessfulCreate

Created pod: model-chainer-raw-hpa-5628a-687c9b99db-wsj4p

kserve-ci-e2e-test

kubelet

model-chainer-raw-hpa-5628a-687c9b99db-wsj4p

Started

Started container model-chainer-raw-hpa-5628a

kserve-ci-e2e-test

kubelet

model-chainer-raw-hpa-5628a-687c9b99db-wsj4p

Pulled

Container image "quay.io/opendatahub/kserve-router@sha256:6a7f2e0065ff588673ad687410eb1f2b5ccfca9b42b71ed586d2fac5a64a6ca4" already present on machine
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-sklearn-graph-raw-hpa-5628a-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

multus

model-chainer-raw-hpa-5628a-687c9b99db-wsj4p

AddedInterface

Add eth0 [10.133.0.32/23] from ovn-kubernetes

kserve-ci-e2e-test

kubelet

model-chainer-raw-hpa-5628a-687c9b99db-wsj4p

Created

Created container: model-chainer-raw-hpa-5628a

kserve-ci-e2e-test

InferenceGraphController

model-chainer-raw-hpa-5628a

InternalError

fails to update InferenceGraph status: Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "model-chainer-raw-hpa-5628a": the object has been modified; please apply your changes to the latest version and try again

kserve-ci-e2e-test

InferenceGraphController

model-chainer-raw-hpa-5628a

UpdateFailed

Failed to update status for InferenceGraph "model-chainer-raw-hpa-5628a": Operation cannot be fulfilled on inferencegraphs.serving.kserve.io "model-chainer-raw-hpa-5628a": the object has been modified; please apply your changes to the latest version and try again
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-raw-hpa-5628a-predictor

FailedComputeMetricsReplicas

invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)
(x4)

kserve-ci-e2e-test

horizontal-pod-autoscaler

isvc-xgboost-graph-raw-hpa-5628a-predictor

FailedGetResourceMetric

failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready)

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Killing

Stopping container kube-rbac-proxy

kserve-ci-e2e-test

replicaset-controller

message-dumper-raw-a8bff-predictor-6d6466ff77

SuccessfulCreate

Created pod: message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq

kserve-ci-e2e-test

kubelet

isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf

Killing

Stopping container kserve-container

kserve-ci-e2e-test

deployment-controller

message-dumper-raw-a8bff-predictor

ScalingReplicaSet

Scaled up replica set message-dumper-raw-a8bff-predictor-6d6466ff77 from 0 to 1

kserve-ci-e2e-test

kubelet

model-chainer-raw-hpa-5628a-687c9b99db-wsj4p

Killing

Stopping container model-chainer-raw-hpa-5628a

kserve-ci-e2e-test

kubelet

isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t

Killing

Stopping container kserve-container
| (x2) | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "message-dumper-raw-a8bff-predictor-serving-cert" not found |
| | kserve-ci-e2e-test | multus | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | AddedInterface | Add eth0 [10.134.0.22/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf | Unhealthy | Readiness probe failed: Get "https://10.134.0.20:8643/healthz": dial tcp 10.134.0.20:8643: connect: connection refused |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Pulling | Pulling image "gcr.io/knative-releases/knative.dev/eventing-contrib/cmd/event_display" |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Pulled | Successfully pulled image "gcr.io/knative-releases/knative.dev/eventing-contrib/cmd/event_display" in 1.058s (1.058s including waiting). Image size: 14813193 bytes. |
| (x9) | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t | Unhealthy | Readiness probe failed: dial tcp 10.134.0.21:8080: connect: connection refused |
| | kserve-ci-e2e-test | kubelet | isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t | Unhealthy | Readiness probe failed: Get "https://10.134.0.21:8643/healthz": dial tcp 10.134.0.21:8643: connect: connection refused |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Created | Created container: kserve-container |
| (x25) | kserve-ci-e2e-test | v1beta1Controllers | message-dumper-raw-a8bff | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| | kserve-ci-e2e-test | v1beta1Controllers | message-dumper-raw-a8bff | InferenceServiceReady | InferenceService [message-dumper-raw-a8bff] is Ready |
| | kserve-ci-e2e-test | deployment-controller | isvc-logger-raw-a8bff-predictor | ScalingReplicaSet | Scaled up replica set isvc-logger-raw-a8bff-predictor-b8fd65b67 from 0 to 1 |
| | kserve-ci-e2e-test | replicaset-controller | isvc-logger-raw-a8bff-predictor-b8fd65b67 | SuccessfulCreate | Created pod: isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Started | Started container storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Pulled | Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine |
| | kserve-ci-e2e-test | multus | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | AddedInterface | Add eth0 [10.134.0.23/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Created | Created container: storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Started | Started container agent |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Created | Created container: agent |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Pulled | Container image "quay.io/opendatahub/kserve-agent@sha256:de59d4f440abaeb1e71b5977a2145cdbe8db88ded8ac16ca09f179d82ba41738" already present on machine |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Pulled | Container image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404" already present on machine |
| (x6) | kserve-ci-e2e-test | kubelet | model-chainer-raw-hpa-5628a-687c9b99db-wsj4p | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 |
| (x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | message-dumper-raw-a8bff-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | message-dumper-raw-a8bff-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x25) | kserve-ci-e2e-test | v1beta1Controllers | isvc-logger-raw-a8bff | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| (x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-logger-raw-a8bff-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-logger-raw-a8bff-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-logger-raw-a8bff | InferenceServiceReady | InferenceService [isvc-logger-raw-a8bff] is Ready |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-logger-raw-a8bff-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-logger-raw-a8bff-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Killing | Stopping container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq | Killing | Stopping container kserve-container |
| | kserve-ci-e2e-test | deployment-controller | isvc-sklearn-scale-raw-9745c-predictor | ScalingReplicaSet | Scaled up replica set isvc-sklearn-scale-raw-9745c-predictor-74db4886f from 0 to 1 |
| | kserve-ci-e2e-test | multus | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | AddedInterface | Add eth0 [10.134.0.24/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Killing | Stopping container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Killing | Stopping container agent |
| | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Killing | Stopping container kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Started | Started container storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Pulled | Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine |
| | kserve-ci-e2e-test | replicaset-controller | isvc-sklearn-scale-raw-9745c-predictor-74db4886f | SuccessfulCreate | Created pod: isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Created | Created container: storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Pulled | Container image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404" already present on machine |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Started | Started container kserve-container |
| (x25) | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-scale-raw-9745c | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| (x10) | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Unhealthy | Readiness probe failed: dial tcp 10.134.0.23:8080: connect: connection refused |
| (x10) | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Unhealthy | Readiness probe failed: HTTP probe failed with statuscode: 503 |
| (x5) | kserve-ci-e2e-test | kubelet | isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp | Unhealthy | Readiness probe failed: Get "https://10.134.0.23:8643/healthz": dial tcp 10.134.0.23:8643: connect: connection refused |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-scale-raw-9745c-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-scale-raw-9745c-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x6) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-scale-raw-9745c-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x6) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-sklearn-scale-raw-9745c-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-sklearn-scale-raw-9745c | InferenceServiceReady | InferenceService [isvc-sklearn-scale-raw-9745c] is Ready |
| | kserve-ci-e2e-test | replicaset-controller | isvc-primary-b9e6a0-predictor-5bdbbd7cbd | SuccessfulCreate | Created pod: isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n |
| | kserve-ci-e2e-test | deployment-controller | isvc-primary-b9e6a0-predictor | ScalingReplicaSet | Scaled up replica set isvc-primary-b9e6a0-predictor-5bdbbd7cbd from 0 to 1 |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Killing | Stopping container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Killing | Stopping container kserve-container |
| | kserve-ci-e2e-test | multus | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | AddedInterface | Add eth0 [10.134.0.25/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Pulled | Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Created | Created container: storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Started | Started container storage-initializer |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Pulled | Container image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404" already present on machine |
| (x16) | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Unhealthy | Readiness probe failed: dial tcp 10.134.0.24:8080: connect: connection refused |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Created | Created container: kserve-container |
| (x2) | kserve-ci-e2e-test | kubelet | isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7 | Unhealthy | Readiness probe failed: Get "https://10.134.0.24:8643/healthz": dial tcp 10.134.0.24:8643: connect: connection refused |
| (x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-primary-b9e6a0-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-primary-b9e6a0-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x8) | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Unhealthy | Readiness probe failed: dial tcp 10.134.0.25:8080: connect: connection refused |
| (x10) | kserve-ci-e2e-test | v1beta1Controllers | isvc-primary-b9e6a0 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-primary-b9e6a0 | InferenceServiceReady | InferenceService [isvc-primary-b9e6a0] is Ready |
| (x2) | kserve-ci-e2e-test | v1beta1Controllers | isvc-secondary-b9e6a0 | UpdateFailed | Failed to update status for InferenceService "isvc-secondary-b9e6a0": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-secondary-b9e6a0": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | deployment-controller | isvc-secondary-b9e6a0-predictor | ScalingReplicaSet | Scaled up replica set isvc-secondary-b9e6a0-predictor-5d7d9858d7 from 0 to 1 |
| (x2) | kserve-ci-e2e-test | v1beta1Controllers | isvc-secondary-b9e6a0 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-secondary-b9e6a0": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | replicaset-controller | isvc-secondary-b9e6a0-predictor-5d7d9858d7 | SuccessfulCreate | Created pod: isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg |
| | kserve-ci-e2e-test | kubelet | isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "isvc-secondary-b9e6a0-predictor-serving-cert" not found |
| | kserve-ci-e2e-test | multus | isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg | AddedInterface | Add eth0 [10.134.0.26/23] from ovn-kubernetes |
| (x2) | kserve-ci-e2e-test | kubelet | isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg | Started | Started container storage-initializer |
| (x2) | kserve-ci-e2e-test | kubelet | isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg | Created | Created container: storage-initializer |
| (x2) | kserve-ci-e2e-test | kubelet | isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg | Pulled | Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine |
| (x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-primary-b9e6a0-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-primary-b9e6a0-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| | kserve-ci-e2e-test | kubelet | isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg | BackOff | Back-off restarting failed container storage-initializer in pod isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_kserve-ci-e2e-test(a43f12ab-b64a-4dee-aea6-defd4e555c99) |
| (x14) | kserve-ci-e2e-test | v1beta1Controllers | isvc-secondary-b9e6a0 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-secondary-b9e6a0-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| | kserve-ci-e2e-test | horizontal-pod-autoscaler | isvc-secondary-b9e6a0-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| | kserve-ci-e2e-test | kubelet | isvc-init-fail-687035-predictor-857cbdf89c-qnws7 | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "isvc-init-fail-687035-predictor-serving-cert" not found |
| | kserve-ci-e2e-test | deployment-controller | isvc-init-fail-687035-predictor | ScalingReplicaSet | Scaled up replica set isvc-init-fail-687035-predictor-857cbdf89c from 0 to 1 |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-init-fail-687035 | UpdateFailed | Failed to update status for InferenceService "isvc-init-fail-687035": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-init-fail-687035": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | v1beta1Controllers | isvc-init-fail-687035 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "isvc-init-fail-687035": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Killing | Stopping container kserve-container |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Killing | Stopping container kube-rbac-proxy |
| | kserve-ci-e2e-test | replicaset-controller | isvc-init-fail-687035-predictor-857cbdf89c | SuccessfulCreate | Created pod: isvc-init-fail-687035-predictor-857cbdf89c-qnws7 |
| | kserve-ci-e2e-test | multus | isvc-init-fail-687035-predictor-857cbdf89c-qnws7 | AddedInterface | Add eth0 [10.134.0.27/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n | Unhealthy | Readiness probe failed: Get "https://10.134.0.25:8643/healthz": dial tcp 10.134.0.25:8643: connect: connection refused |
| (x2) | kserve-ci-e2e-test | kubelet | isvc-init-fail-687035-predictor-857cbdf89c-qnws7 | Started | Started container storage-initializer |
| (x2) | kserve-ci-e2e-test | kubelet | isvc-init-fail-687035-predictor-857cbdf89c-qnws7 | Created | Created container: storage-initializer |
| (x2) | kserve-ci-e2e-test | kubelet | isvc-init-fail-687035-predictor-857cbdf89c-qnws7 | Pulled | Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine |
| (x12) | kserve-ci-e2e-test | v1beta1Controllers | isvc-init-fail-687035 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| | kserve-ci-e2e-test | kubelet | isvc-init-fail-687035-predictor-857cbdf89c-qnws7 | BackOff | Back-off restarting failed container storage-initializer in pod isvc-init-fail-687035-predictor-857cbdf89c-qnws7_kserve-ci-e2e-test(df050d88-56c8-49d8-b981-4aaf0edfc7cf) |
| | kserve-ci-e2e-test | v1beta1Controllers | raw-sklearn-3e269 | InternalError | fails to update InferenceService status: Operation cannot be fulfilled on inferenceservices.serving.kserve.io "raw-sklearn-3e269": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | FailedMount | MountVolume.SetUp failed for volume "proxy-tls" : secret "raw-sklearn-3e269-predictor-serving-cert" not found |
| (x2) | kserve-ci-e2e-test | v1beta1Controllers | raw-sklearn-3e269 | UpdateFailed | Failed to update status for InferenceService "raw-sklearn-3e269": Operation cannot be fulfilled on inferenceservices.serving.kserve.io "raw-sklearn-3e269": the object has been modified; please apply your changes to the latest version and try again |
| | kserve-ci-e2e-test | replicaset-controller | raw-sklearn-3e269-predictor-75dc5c954c | SuccessfulCreate | Created pod: raw-sklearn-3e269-predictor-75dc5c954c-zt7nt |
| | kserve-ci-e2e-test | deployment-controller | raw-sklearn-3e269-predictor | ScalingReplicaSet | Scaled up replica set raw-sklearn-3e269-predictor-75dc5c954c from 0 to 1 |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Pulled | Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Created | Created container: storage-initializer |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Started | Started container storage-initializer |
| | kserve-ci-e2e-test | multus | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | AddedInterface | Add eth0 [10.134.0.28/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Created | Created container: kserve-container |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Pulled | Container image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404" already present on machine |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Started | Started container kserve-container |
| (x22) | kserve-ci-e2e-test | v1beta1Controllers | raw-sklearn-3e269 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| | kserve-ci-e2e-test | horizontal-pod-autoscaler | raw-sklearn-3e269-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| | kserve-ci-e2e-test | horizontal-pod-autoscaler | raw-sklearn-3e269-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| | kserve-ci-e2e-test | v1beta1Controllers | raw-sklearn-3e269 | InferenceServiceReady | InferenceService [raw-sklearn-3e269] is Ready |
| (x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | raw-sklearn-3e269-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x4) | kserve-ci-e2e-test | horizontal-pod-autoscaler | raw-sklearn-3e269-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| | kserve-ci-e2e-test | deployment-controller | raw-sklearn-runtime-cdca2-predictor | ScalingReplicaSet | Scaled up replica set raw-sklearn-runtime-cdca2-predictor-65dc66dfd5 from 0 to 1 |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Started | Started container storage-initializer |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Killing | Stopping container kube-rbac-proxy |
| | kserve-ci-e2e-test | multus | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | AddedInterface | Add eth0 [10.134.0.29/23] from ovn-kubernetes |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Pulled | Container image "quay.io/opendatahub/kserve-storage-initializer@sha256:3460f014e7dc0a9d3daafe0716ca9eadf865f2892e0a5103d0b876da9f34891f" already present on machine |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Created | Created container: storage-initializer |
| | kserve-ci-e2e-test | replicaset-controller | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5 | SuccessfulCreate | Created pod: raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Killing | Stopping container kserve-container |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Unhealthy | Readiness probe failed: Get "https://10.134.0.28:8643/healthz": dial tcp 10.134.0.28:8643: connect: connection refused |
| (x9) | kserve-ci-e2e-test | kubelet | raw-sklearn-3e269-predictor-75dc5c954c-zt7nt | Unhealthy | Readiness probe failed: dial tcp 10.134.0.28:8080: connect: connection refused |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Pulled | Container image "quay.io/opendatahub/sklearn-serving-runtime:odh-pr-1404" already present on machine |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Pulled | Container image "quay.io/opendatahub/odh-kube-auth-proxy@sha256:dcb09fbabd8811f0956ef612a0c9ddd5236804b9bd6548a0647d2b531c9d01b3" already present on machine |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Created | Created container: kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Started | Started container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Started | Started container kserve-container |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Created | Created container: kserve-container |
| (x25) | kserve-ci-e2e-test | v1beta1Controllers | raw-sklearn-runtime-cdca2 | VirtualServiceCRDNotFound | Istio VirtualService CRD not present; VirtualService reconciliation skipped. If you do not use Istio, set ingress.disableIstioVirtualHost=true. |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | raw-sklearn-runtime-cdca2-predictor | FailedGetResourceMetric | failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| (x3) | kserve-ci-e2e-test | horizontal-pod-autoscaler | raw-sklearn-runtime-cdca2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: unable to get metrics for resource cpu: no metrics returned from resource metrics API |
| | kserve-ci-e2e-test | v1beta1Controllers | raw-sklearn-runtime-cdca2 | InferenceServiceReady | InferenceService [raw-sklearn-runtime-cdca2] is Ready |
| (x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | raw-sklearn-runtime-cdca2-predictor | FailedGetResourceMetric | failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| (x2) | kserve-ci-e2e-test | horizontal-pod-autoscaler | raw-sklearn-runtime-cdca2-predictor | FailedComputeMetricsReplicas | invalid metrics (1 invalid out of 1), first error is: failed to get cpu resource metric value: failed to get cpu utilization: did not receive metrics for targeted pods (pods might be unready) |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Killing | Stopping container kube-rbac-proxy |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Killing | Stopping container kserve-container |
| | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Unhealthy | Readiness probe failed: Get "https://10.134.0.29:8643/healthz": dial tcp 10.134.0.29:8643: connect: connection refused |
| (x9) | kserve-ci-e2e-test | kubelet | raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf | Unhealthy | Readiness probe failed: dial tcp 10.134.0.29:8080: connect: connection refused |