Apr 24 21:26:39.042614 ip-10-0-139-5 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:26:39.450978 ip-10-0-139-5 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:39.450978 ip-10-0-139-5 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:26:39.450978 ip-10-0-139-5 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:39.450978 ip-10-0-139-5 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:26:39.450978 ip-10-0-139-5 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
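The deprecation warnings above all point at the file passed via --config (here /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent KubeletConfiguration fields — field names assume the kubelet.config.k8s.io/v1beta1 API, and the values below are illustrative, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservation values)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```

Per the warnings, --pod-infra-container-image has no config-file replacement; the sandbox image is taken from the CRI runtime's own configuration instead.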
Apr 24 21:26:39.453727 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.453632 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:26:39.459517 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459487 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:39.459517 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459508 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:39.459517 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459512 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:39.459517 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459517 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:39.459517 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459520 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:39.459517 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459523 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:39.459517 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459527 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:39.459517 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459530 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459533 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459536 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459538 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459541 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459544 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459546 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459549 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459551 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459554 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459556 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459558 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459561 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459564 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459567 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459569 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459572 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459574 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459577 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459587 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:39.459830 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459590 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459593 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459595 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459598 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459600 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459603 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459605 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459608 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459610 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459613 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459615 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459618 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459620 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459622 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459626 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459629 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459633 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459635 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459639 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459642 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:39.460317 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459644 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459647 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459649 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459652 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459655 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459657 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459660 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459662 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459665 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459667 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459670 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459673 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459675 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459678 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459680 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459683 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459686 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459688 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459691 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:39.460839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459693 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459695 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459699 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459703 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459705 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459708 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459711 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459714 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459717 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459719 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459723 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459726 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459729 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459732 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459735 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459737 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459740 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459744 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459748 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:39.461320 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.459751 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460157 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460163 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460167 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460171 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460174 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460176 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460179 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460182 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460184 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460187 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460189 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460192 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460194 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460197 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460199 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460202 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460204 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460207 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460209 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:39.461790 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460212 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460215 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460217 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460220 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460222 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460225 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460228 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460231 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460233 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460236 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460238 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460241 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460244 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460246 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460248 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460251 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460253 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460255 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460258 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460260 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:39.462331 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460263 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460265 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460268 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460272 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460276 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460279 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460282 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460284 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460287 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460290 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460313 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460316 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460321 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460324 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460327 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460330 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460333 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460335 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460339 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460342 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:39.462871 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460344 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460347 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460349 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460352 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460354 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460357 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460360 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460362 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460365 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460367 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460370 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460372 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460375 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460377 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460379 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460382 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460385 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460387 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460390 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:39.463384 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460392 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460396 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460398 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460401 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460403 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460407 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460410 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.460412 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460489 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460497 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460503 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460512 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460517 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460521 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460525 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460529 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460532 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460535 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460539 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460542 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460546 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460550 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:26:39.463850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460554 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460559 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460563 2571 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460568 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460572 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460578 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460581 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460584 2571 flags.go:64] FLAG: --config-dir=""
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460587 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460590 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460594 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460597 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460600 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460603 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460606 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460610 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460613 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460616 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460619 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460623 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460627 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460629 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460632 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460636 2571 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460639 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:26:39.464406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460644 2571 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460648 2571 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460651 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460654 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460656 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460660 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460663 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460666 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460669 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460671 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460674 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460677 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460681 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460683 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460686 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460689 2571 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460693 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460696 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460699 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460701 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460704 2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460716 2571 flags.go:64] FLAG: --help="false"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460720 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-139-5.ec2.internal"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460723 2571 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 24 21:26:39.465013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460726 2571 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460729 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460732 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]:
I0424 21:26:39.460736 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460739 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460741 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460744 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460747 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460751 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460754 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460757 2571 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460759 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460762 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460765 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460768 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460770 2571 flags.go:64] FLAG: --lock-file="" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460773 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460776 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:26:39.465609 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:26:39.460778 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460784 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460786 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460789 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460792 2571 flags.go:64] FLAG: --logging-format="text" Apr 24 21:26:39.465609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460795 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460798 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460801 2571 flags.go:64] FLAG: --manifest-url="" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460803 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460808 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460811 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460815 2571 flags.go:64] FLAG: --max-pods="110" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460820 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460823 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460826 2571 flags.go:64] FLAG: --memory-manager-policy="None" 
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460829 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460832 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460834 2571 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460837 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460846 2571 flags.go:64] FLAG: --node-status-max-images="50"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460849 2571 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460852 2571 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460855 2571 flags.go:64] FLAG: --pod-cidr=""
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460859 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460864 2571 flags.go:64] FLAG: --pod-manifest-path=""
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460867 2571 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460870 2571 flags.go:64] FLAG: --pods-per-core="0"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460872 2571 flags.go:64] FLAG: --port="10250"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460876 2571 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 24 21:26:39.466162 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460879 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fd6aae17fb3a64b8"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460882 2571 flags.go:64] FLAG: --qos-reserved=""
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460885 2571 flags.go:64] FLAG: --read-only-port="10255"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460888 2571 flags.go:64] FLAG: --register-node="true"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460890 2571 flags.go:64] FLAG: --register-schedulable="true"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460893 2571 flags.go:64] FLAG: --register-with-taints=""
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460896 2571 flags.go:64] FLAG: --registry-burst="10"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460899 2571 flags.go:64] FLAG: --registry-qps="5"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460902 2571 flags.go:64] FLAG: --reserved-cpus=""
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460905 2571 flags.go:64] FLAG: --reserved-memory=""
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460909 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460912 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460915 2571 flags.go:64] FLAG: --rotate-certificates="false"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460918 2571 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460920 2571 flags.go:64] FLAG: --runonce="false"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460924 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460927 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460930 2571 flags.go:64] FLAG: --seccomp-default="false"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460933 2571 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460936 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460939 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460942 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460945 2571 flags.go:64] FLAG: --storage-driver-password="root"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460948 2571 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460951 2571 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460954 2571 flags.go:64] FLAG: --storage-driver-user="root"
Apr 24 21:26:39.466787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460957 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460960 2571 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460963 2571 flags.go:64] FLAG: --system-cgroups=""
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460969 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460974 2571 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460977 2571 flags.go:64] FLAG: --tls-cert-file=""
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460980 2571 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460984 2571 flags.go:64] FLAG: --tls-min-version=""
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460987 2571 flags.go:64] FLAG: --tls-private-key-file=""
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460989 2571 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460992 2571 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460995 2571 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.460997 2571 flags.go:64] FLAG: --v="2"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.461002 2571 flags.go:64] FLAG: --version="false"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.461005 2571 flags.go:64] FLAG: --vmodule=""
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.461009 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.461013 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461113 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461117 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461120 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461123 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461127 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461129 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461132 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:39.467466 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461134 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461137 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461139 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461142 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461145 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461147 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461150 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461152 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461155 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461158 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461160 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461164 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461167 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461169 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461171 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461174 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461176 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461179 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461181 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:39.468044 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461183 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461186 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461189 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461191 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461194 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461196 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461199 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461202 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461204 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461206 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461210 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461213 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461215 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461218 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461220 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461223 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461225 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461228 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461230 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461232 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:39.468555 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461235 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461237 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461239 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461243 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461247 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461250 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461254 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461258 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461261 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461263 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461266 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461269 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461271 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461273 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461276 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461279 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461281 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461283 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461286 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461288 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:39.469060 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461304 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461307 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461313 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461316 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461319 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461322 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461324 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461327 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461330 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461333 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461335 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461338 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461340 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461343 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461346 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461349 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461353 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461357 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461360 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:39.469565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.461363 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.462203 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.469362 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.469378 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469425 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469430 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469433 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469436 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469439 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469442 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469445 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469448 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469451 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469454 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469456 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:39.470039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469459 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469462 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469464 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469467 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469470 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469473 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469475 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469478 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469481 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469483 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469487 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469491 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469494 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469497 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469499 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469502 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469504 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469507 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469510 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:39.470471 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469512 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469517 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469520 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469522 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469524 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469527 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469529 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469532 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469534 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469536 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469538 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469541 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469543 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:39.470931
ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469546 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469548 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469551 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469553 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469555 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469559 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469561 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:26:39.470931 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469563 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469566 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469569 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469571 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469575 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469579 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469582 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469584 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469586 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469589 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469591 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469593 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469596 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469599 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469602 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469605 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469607 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469609 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469612 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469614 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:39.471441 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469617 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469619 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469622 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469624 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469626 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469629 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469631 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469633 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469637 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469639 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469641 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469644 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469646 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469648 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469651 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:39.471955 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469653 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.469658 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469758 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469764 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469767 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469770 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469772 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469774 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469777 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469780 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469782 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469787 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469791 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469795 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469798 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:39.472345 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469801 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469803 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469806 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469809 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469812 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469815 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469817 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469819 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469822 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469825 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469827 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469829 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469832 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469834 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469837 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469839 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469841 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469844 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469846 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469849 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:39.472720 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469851 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469853 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469856 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469858 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469861 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469863 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469866 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469868 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469871 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469874 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469877 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469879 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469882 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469884 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469887 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469889 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469892 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469894 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469896 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469899 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:39.473287 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469901 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469903 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469906 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469908 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469910 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469913 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469915 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469917 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469920 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469922 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469925 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469927 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469930 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469932 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469934 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469937 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469939 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469942 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469944 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469946 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:39.473839 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469949 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469952 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469954 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469957 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469959 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469962 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469965 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469968 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469970 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469973 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469975 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469978 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:39.469980 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.469984 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:39.474470 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.470622 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:26:39.474894 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.472910 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:26:39.474894 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.473848 2571 server.go:1019] "Starting client certificate rotation"
Apr 24 21:26:39.474894 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.473966 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:39.474894 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.474676 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:39.496787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.496761 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:39.500667 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.500644 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:39.520521 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.520498 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:26:39.525751 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.525730 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:39.526901 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.526888 2571 log.go:25] "Validated CRI v1 image API"
Apr 24 21:26:39.528924 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.528908 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:26:39.532627 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.532607 2571 fs.go:135] Filesystem UUIDs: map[5e1294cb-4ed3-4ef9-82f5-daae44cfc2e8:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 ff64feca-80fc-4abc-954f-8008f5dcc817:/dev/nvme0n1p3]
Apr 24 21:26:39.532712 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.532628 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:26:39.538396 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.538276 2571 manager.go:217] Machine: {Timestamp:2026-04-24 21:26:39.536405878 +0000 UTC m=+0.384449985 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098532 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2013cc3dca18415ffac256a2d09b9e SystemUUID:ec2013cc-3dca-1841-5ffa-c256a2d09b9e BootID:22ed518f-5d8a-4a1d-b304-19ac99f09b64 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d3:55:c0:84:d7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d3:55:c0:84:d7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:4a:1b:aa:2d:ac Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:26:39.538396 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.538392 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:26:39.538499 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.538474 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:26:39.540889 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.540866 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:26:39.541035 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.540891 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-5.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:26:39.541079 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.541044 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:26:39.541079 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.541052 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:26:39.541079 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.541069 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:26:39.541748 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.541737 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:26:39.543496 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.543486 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:26:39.543598 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.543588 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:26:39.546477 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.546465 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:26:39.546513 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.546484 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:26:39.546513 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.546497 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:26:39.546513 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.546508 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:26:39.546601 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.546519 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:26:39.547587 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.547572 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:26:39.547629 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.547600 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:26:39.550540 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.550526 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:26:39.552135 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.552122 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:26:39.553817 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553806 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:26:39.553861 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553822 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:26:39.553861 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553828 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:26:39.553861 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553834 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:26:39.553861 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553839 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:26:39.553861 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553847 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 21:26:39.553861 ip-10-0-139-5
kubenswrapper[2571]: I0424 21:26:39.553852 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:26:39.553861 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553858 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:26:39.554042 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553865 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:26:39.554042 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553871 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:26:39.554042 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553895 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:26:39.554042 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.553904 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:26:39.555453 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.555443 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:26:39.555453 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.555453 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:26:39.558227 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.558113 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vn7bl" Apr 24 21:26:39.558930 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.558805 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:26:39.559034 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.558956 2571 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:26:39.559979 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.559957 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:26:39.560114 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.560104 2571 server.go:1295] "Started kubelet" Apr 24 21:26:39.560243 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.560217 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:26:39.560531 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.560475 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:26:39.560604 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.560550 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:26:39.561149 ip-10-0-139-5 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:26:39.562039 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.561942 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-5.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:26:39.562760 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.562744 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:26:39.565217 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.565198 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:26:39.565373 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.565352 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vn7bl" Apr 24 21:26:39.570340 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.570318 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:26:39.572783 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.572767 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:39.573257 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.573245 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:26:39.573957 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.573941 2571 factory.go:55] Registering systemd factory Apr 24 21:26:39.574046 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574005 2571 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:26:39.574184 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574047 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:26:39.574222 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574050 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:26:39.574222 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574199 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:26:39.574273 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574257 2571 factory.go:153] Registering CRI-O factory Apr 24 21:26:39.574327 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574273 2571 factory.go:223] Registration of the crio container factory successfully Apr 24 21:26:39.574377 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574332 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:26:39.574377 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574340 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:26:39.574377 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574368 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api 
service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:26:39.574463 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574394 2571 factory.go:103] Registering Raw factory Apr 24 21:26:39.574463 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574411 2571 manager.go:1196] Started watching for new ooms in manager Apr 24 21:26:39.574628 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.574496 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:39.574803 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.574791 2571 manager.go:319] Starting recovery of all containers Apr 24 21:26:39.576508 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.576489 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:39.580628 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.580590 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-5.ec2.internal\" not found" node="ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.586278 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.586258 2571 manager.go:324] Recovery completed Apr 24 21:26:39.590350 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.590336 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:39.593050 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.593035 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:39.593120 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.593064 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:39.593120 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.593074 2571 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:39.593564 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.593551 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:26:39.593564 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.593563 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:26:39.593672 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.593579 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:39.596265 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.596253 2571 policy_none.go:49] "None policy: Start" Apr 24 21:26:39.596324 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.596269 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:26:39.596324 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.596280 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:26:39.631903 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.631884 2571 manager.go:341] "Starting Device Plugin manager" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.631927 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.631940 2571 server.go:85] "Starting device plugin registration server" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.632214 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.632228 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.632417 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 21:26:39.632527 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.632537 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.633095 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:26:39.641029 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.633131 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:39.641702 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.641680 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:26:39.643111 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.643097 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:26:39.643184 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.643121 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:26:39.643184 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.643137 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:26:39.643184 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.643143 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:26:39.643184 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.643177 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:26:39.647028 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.647010 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:39.733354 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.733274 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:39.734326 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.734288 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:39.734391 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.734345 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:39.734391 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.734356 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:39.734391 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.734380 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.743155 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.743139 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.743225 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.743163 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-5.ec2.internal\": node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:39.743267 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:26:39.743242 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal"] Apr 24 21:26:39.743333 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.743323 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:39.744208 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.744195 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:39.744272 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.744220 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:39.744272 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.744230 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:39.745755 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.745743 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:39.745898 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.745875 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.745898 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.745904 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:39.746426 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.746410 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:39.746516 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.746431 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:39.746516 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.746456 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:39.746516 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.746471 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:39.746516 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.746435 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:39.746708 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.746525 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:39.747608 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.747592 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.747680 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.747622 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:39.748217 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.748200 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:39.748310 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.748233 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:39.748310 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.748248 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:39.763497 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.763477 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:39.763910 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.763896 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-5.ec2.internal\" not found" node="ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.768107 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.768089 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-5.ec2.internal\" not found" node="ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.775454 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.775437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76273891b5ff00ed3370baaa1995c530-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal\" (UID: 
\"76273891b5ff00ed3370baaa1995c530\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.775533 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.775462 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76273891b5ff00ed3370baaa1995c530-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal\" (UID: \"76273891b5ff00ed3370baaa1995c530\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.775533 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.775478 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb850784b78b1ad1b82871c81c9b1b43-config\") pod \"kube-apiserver-proxy-ip-10-0-139-5.ec2.internal\" (UID: \"fb850784b78b1ad1b82871c81c9b1b43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.864593 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.864564 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:39.875985 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.875956 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76273891b5ff00ed3370baaa1995c530-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal\" (UID: \"76273891b5ff00ed3370baaa1995c530\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.875985 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.875987 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76273891b5ff00ed3370baaa1995c530-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal\" (UID: \"76273891b5ff00ed3370baaa1995c530\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.876141 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.876004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb850784b78b1ad1b82871c81c9b1b43-config\") pod \"kube-apiserver-proxy-ip-10-0-139-5.ec2.internal\" (UID: \"fb850784b78b1ad1b82871c81c9b1b43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.876141 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.876042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb850784b78b1ad1b82871c81c9b1b43-config\") pod \"kube-apiserver-proxy-ip-10-0-139-5.ec2.internal\" (UID: \"fb850784b78b1ad1b82871c81c9b1b43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.876141 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.876042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76273891b5ff00ed3370baaa1995c530-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal\" (UID: \"76273891b5ff00ed3370baaa1995c530\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.876141 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:39.876070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76273891b5ff00ed3370baaa1995c530-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal\" (UID: \"76273891b5ff00ed3370baaa1995c530\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" Apr 24 21:26:39.965030 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:39.964989 
2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:40.065800 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:40.065768 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:40.065800 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.065778 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" Apr 24 21:26:40.071035 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.071012 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal" Apr 24 21:26:40.166615 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:40.166578 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:40.267170 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:40.267137 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:40.367740 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:40.367651 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:40.468239 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:40.468205 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found" Apr 24 21:26:40.473378 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.473353 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:26:40.473532 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.473515 2571 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:26:40.473584 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.473555 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:26:40.529883 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.529852 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:40.546688 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:40.546657 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76273891b5ff00ed3370baaa1995c530.slice/crio-fbb720e347f5f1a40f557706ed9e9b023734c4a74674967ec33b69dc5fbb3a94 WatchSource:0}: Error finding container fbb720e347f5f1a40f557706ed9e9b023734c4a74674967ec33b69dc5fbb3a94: Status 404 returned error can't find the container with id fbb720e347f5f1a40f557706ed9e9b023734c4a74674967ec33b69dc5fbb3a94
Apr 24 21:26:40.546945 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:40.546917 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb850784b78b1ad1b82871c81c9b1b43.slice/crio-f31a77fa37209be48f3b2f6527e4b63f005fb57bde151b2e2f11d20c60e00444 WatchSource:0}: Error finding container f31a77fa37209be48f3b2f6527e4b63f005fb57bde151b2e2f11d20c60e00444: Status 404 returned error can't find the container with id f31a77fa37209be48f3b2f6527e4b63f005fb57bde151b2e2f11d20c60e00444
Apr 24 21:26:40.550735 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.550720 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:26:40.568251 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.568222 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:21:39 +0000 UTC" deadline="2027-09-24 15:39:09.992393367 +0000 UTC"
Apr 24 21:26:40.568251 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.568249 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12426h12m29.424146267s"
Apr 24 21:26:40.568390 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:40.568276 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-5.ec2.internal\" not found"
Apr 24 21:26:40.573863 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.573836 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:26:40.585913 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.585889 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:40.603501 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.603481 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4vdxb"
Apr 24 21:26:40.612746 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.612722 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4vdxb"
Apr 24 21:26:40.635464 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.635420 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:40.646359 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.646309 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" event={"ID":"76273891b5ff00ed3370baaa1995c530","Type":"ContainerStarted","Data":"fbb720e347f5f1a40f557706ed9e9b023734c4a74674967ec33b69dc5fbb3a94"}
Apr 24 21:26:40.647274 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.647253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal" event={"ID":"fb850784b78b1ad1b82871c81c9b1b43","Type":"ContainerStarted","Data":"f31a77fa37209be48f3b2f6527e4b63f005fb57bde151b2e2f11d20c60e00444"}
Apr 24 21:26:40.674113 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.674091 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal"
Apr 24 21:26:40.685703 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.685684 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:40.686527 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.686516 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal"
Apr 24 21:26:40.699431 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:40.699415 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:41.546908 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.546873 2571 apiserver.go:52] "Watching apiserver"
Apr 24 21:26:41.554784 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.554760 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:26:41.555137 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.555113 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4","openshift-dns/node-resolver-gh9lb","openshift-multus/multus-2gkkz","openshift-multus/network-metrics-daemon-npqvg","openshift-network-diagnostics/network-check-target-77b2w","openshift-network-operator/iptables-alerter-7wctt","openshift-ovn-kubernetes/ovnkube-node-k487s","kube-system/konnectivity-agent-xnxmv","openshift-cluster-node-tuning-operator/tuned-jk496","openshift-image-registry/node-ca-ws882","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal","openshift-multus/multus-additional-cni-plugins-phlsx","kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal"]
Apr 24 21:26:41.557271 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.557255 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.558334 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.558316 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gh9lb"
Apr 24 21:26:41.560268 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.560132 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:26:41.560381 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.560259 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:26:41.560485 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.560460 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:26:41.561057 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.560603 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:26:41.561057 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.560639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.561515 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.561492 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:26:41.561606 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.561500 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:26:41.561676 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.561600 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-lkgpw\""
Apr 24 21:26:41.561892 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.561862 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:26:41.561988 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.561968 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:41.562071 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.561972 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:26:41.562159 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:41.562115 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:41.563009 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.562985 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-84r2g\""
Apr 24 21:26:41.565978 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.563125 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-98czg\""
Apr 24 21:26:41.565978 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.563405 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:26:41.565978 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.563458 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:26:41.565978 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.563612 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:41.565978 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:41.563689 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:41.565978 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.564251 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:26:41.565978 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.564521 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:26:41.567106 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.567082 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7wctt"
Apr 24 21:26:41.568313 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.567362 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4"
Apr 24 21:26:41.569781 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.569609 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:41.570022 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.570002 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:26:41.570110 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.570054 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qjzxr\""
Apr 24 21:26:41.570110 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.570103 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:26:41.570262 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.570245 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:41.570357 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.570284 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rr9p4\""
Apr 24 21:26:41.570357 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.570315 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:26:41.570748 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.570730 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:26:41.571392 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.571360 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xnxmv"
Apr 24 21:26:41.571489 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.571450 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.572760 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.572741 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ws882"
Apr 24 21:26:41.573581 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.573561 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:41.573681 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.573571 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:26:41.573991 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.573971 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2bfqq\""
Apr 24 21:26:41.574119 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.573972 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:41.574119 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.574036 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-phlsx"
Apr 24 21:26:41.574704 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.574646 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:26:41.574704 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.574660 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zthhh\""
Apr 24 21:26:41.575081 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.574908 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:26:41.575081 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.574992 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:26:41.575081 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.575050 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rb7nx\""
Apr 24 21:26:41.575282 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.575213 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:26:41.575437 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.575422 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:26:41.576169 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.576155 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:26:41.576254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.576240 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gntnh\""
Apr 24 21:26:41.576737 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.576719 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:26:41.584692 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-os-release\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.584794 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584713 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-netns\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.584794 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-os-release\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx"
Apr 24 21:26:41.584794 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4"
Apr 24 21:26:41.584951 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584793 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysconfig\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.584951 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584815 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-tuned\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.584951 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584874 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjjd9\" (UniqueName: \"kubernetes.io/projected/1e7352a7-e690-4558-a1e5-876926d3a57f-kube-api-access-pjjd9\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.584951 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-conf-dir\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.585116 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584949 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:41.585116 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.584990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6fdk\" (UniqueName: \"kubernetes.io/projected/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-kube-api-access-r6fdk\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.585116 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585023 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-k8s-cni-cncf-io\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.585116 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-ovn\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.585254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-node-log\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.585254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585170 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-registration-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4"
Apr 24 21:26:41.585254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysctl-conf\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.585254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-run-netns\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.585443 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585275 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-env-overrides\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.585443 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-socket-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4"
Apr 24 21:26:41.585443 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585343 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j57j\" (UniqueName: \"kubernetes.io/projected/6e0edc9b-0c6c-4c56-a267-47907a7053fd-kube-api-access-8j57j\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4"
Apr 24 21:26:41.585443 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585373 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-kubelet\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.585443 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.585443 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:41.585443 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585430 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhknx\" (UniqueName: \"kubernetes.io/projected/1d60af74-5e4a-4c56-8738-0ea78867d785-kube-api-access-fhknx\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585450 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-run\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-lib-modules\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585531 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/95bfde24-898e-4ab6-9414-d93c895b9ba6-hosts-file\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/95bfde24-898e-4ab6-9414-d93c895b9ba6-tmp-dir\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tdnt\" (UniqueName: \"kubernetes.io/projected/95bfde24-898e-4ab6-9414-d93c895b9ba6-kube-api-access-9tdnt\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-daemon-config\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585667 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-multus-certs\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585702 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.585764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585744 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d60af74-5e4a-4c56-8738-0ea78867d785-host-slash\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-device-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-cni-netd\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovn-node-metrics-cert\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585852 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cnibin\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-cni-bin\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.585954 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-etc-kubernetes\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586010 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/205a6f27-fa4b-46d4-bf0b-d91aa8cf134c-konnectivity-ca\") pod \"konnectivity-agent-xnxmv\" (UID: \"205a6f27-fa4b-46d4-bf0b-d91aa8cf134c\") " pod="kube-system/konnectivity-agent-xnxmv"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-var-lib-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586078 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86t9n\" (UniqueName: \"kubernetes.io/projected/7f06c4fe-10e6-4600-864c-07dad67ed49f-kube-api-access-86t9n\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586113 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-sys\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.586134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586131 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-var-lib-kubelet\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-host\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e7352a7-e690-4558-a1e5-876926d3a57f-tmp\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-kubernetes\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496"
Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586241 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee2e975b-8948-45ed-9de6-345f4c54c29e-serviceca\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882"
Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586258 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f06c4fe-10e6-4600-864c-07dad67ed49f-cni-binary-copy\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-socket-dir-parent\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz"
Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586331 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx"
Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586353
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-log-socket\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586373 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1d60af74-5e4a-4c56-8738-0ea78867d785-iptables-alerter-script\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt" Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586394 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-systemd\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586410 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-hostroot\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " 
pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-systemd-units\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586471 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-etc-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.586507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovnkube-config\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586520 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586557 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-598ns\" (UniqueName: 
\"kubernetes.io/projected/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-kube-api-access-598ns\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586576 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-cni-dir\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586614 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-kubelet\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586642 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-systemd\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.587020 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:26:41.586666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysctl-d\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586680 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sknnb\" (UniqueName: \"kubernetes.io/projected/ee2e975b-8948-45ed-9de6-345f4c54c29e-kube-api-access-sknnb\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586707 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-cni-bin\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586724 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovnkube-script-lib\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586738 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-system-cni-dir\") pod \"multus-2gkkz\" (UID: 
\"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-cnibin\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-sys-fs\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsv6l\" (UniqueName: \"kubernetes.io/projected/6db469f2-5afc-41c5-8338-9558deee2bd6-kube-api-access-qsv6l\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:26:41.587020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" 
(UniqueName: \"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.587780 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586887 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-slash\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.587780 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586918 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee2e975b-8948-45ed-9de6-345f4c54c29e-host\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.587780 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586942 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-cni-multus\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.587780 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/205a6f27-fa4b-46d4-bf0b-d91aa8cf134c-agent-certs\") pod \"konnectivity-agent-xnxmv\" (UID: \"205a6f27-fa4b-46d4-bf0b-d91aa8cf134c\") " pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:26:41.587780 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.586998 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-modprobe-d\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.613404 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.613371 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:40 +0000 UTC" deadline="2027-10-23 01:30:39.356151093 +0000 UTC" Apr 24 21:26:41.613404 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.613405 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13108h3m57.742750063s" Apr 24 21:26:41.648346 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.648317 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:41.687713 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.687681 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-env-overrides\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.687864 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.687716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-socket-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.687864 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.687740 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8j57j\" (UniqueName: \"kubernetes.io/projected/6e0edc9b-0c6c-4c56-a267-47907a7053fd-kube-api-access-8j57j\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.687990 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.687901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-socket-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.688047 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-kubelet\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.688102 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.688102 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688065 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-kubelet\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.688102 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 21:26:41.688087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:26:41.688232 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.688232 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhknx\" (UniqueName: \"kubernetes.io/projected/1d60af74-5e4a-4c56-8738-0ea78867d785-kube-api-access-fhknx\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt" Apr 24 21:26:41.688232 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688195 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-run\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.688232 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-lib-modules\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.688459 ip-10-0-139-5 
kubenswrapper[2571]: E0424 21:26:41.688235 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/95bfde24-898e-4ab6-9414-d93c895b9ba6-hosts-file\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb" Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/95bfde24-898e-4ab6-9414-d93c895b9ba6-tmp-dir\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb" Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:41.688325 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:42.188275362 +0000 UTC m=+3.036319488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tdnt\" (UniqueName: \"kubernetes.io/projected/95bfde24-898e-4ab6-9414-d93c895b9ba6-kube-api-access-9tdnt\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb" Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-daemon-config\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688387 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-multus-certs\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688409 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688431 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d60af74-5e4a-4c56-8738-0ea78867d785-host-slash\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt" Apr 24 21:26:41.688459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-device-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-cni-netd\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688504 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovn-node-metrics-cert\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688519 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.688974 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688536 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cnibin\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688549 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/95bfde24-898e-4ab6-9414-d93c895b9ba6-tmp-dir\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-cni-bin\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-lib-modules\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-etc-kubernetes\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688626 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-run\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688633 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/95bfde24-898e-4ab6-9414-d93c895b9ba6-hosts-file\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688639 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/205a6f27-fa4b-46d4-bf0b-d91aa8cf134c-konnectivity-ca\") pod \"konnectivity-agent-xnxmv\" (UID: \"205a6f27-fa4b-46d4-bf0b-d91aa8cf134c\") " pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-device-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-var-lib-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688698 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-cni-netd\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86t9n\" (UniqueName: \"kubernetes.io/projected/7f06c4fe-10e6-4600-864c-07dad67ed49f-kube-api-access-86t9n\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-sys\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-env-overrides\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.688974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-var-lib-kubelet\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688791 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-host\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e7352a7-e690-4558-a1e5-876926d3a57f-tmp\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cnibin\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688840 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-kubernetes\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688865 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee2e975b-8948-45ed-9de6-345f4c54c29e-serviceca\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f06c4fe-10e6-4600-864c-07dad67ed49f-cni-binary-copy\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-cni-bin\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-socket-dir-parent\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.688966 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-log-socket\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1d60af74-5e4a-4c56-8738-0ea78867d785-iptables-alerter-script\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689011 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-systemd\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689058 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-hostroot\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689106 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/205a6f27-fa4b-46d4-bf0b-d91aa8cf134c-konnectivity-ca\") pod \"konnectivity-agent-xnxmv\" (UID: \"205a6f27-fa4b-46d4-bf0b-d91aa8cf134c\") " pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:26:41.689799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689119 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-systemd-units\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-etc-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689165 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689207 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/1d60af74-5e4a-4c56-8738-0ea78867d785-host-slash\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-etc-kubernetes\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689275 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-socket-dir-parent\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-var-lib-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689500 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-kubernetes\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689900 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-hostroot\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee2e975b-8948-45ed-9de6-345f4c54c29e-serviceca\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.689964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-log-socket\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-host\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-systemd-units\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690215 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-var-lib-kubelet\") pod \"tuned-jk496\" (UID: 
\"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690284 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-multus-certs\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690432 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690523 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-systemd\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690573 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-sys\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.690637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-phlsx\" (UID: 
\"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690601 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-etc-openvswitch\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1d60af74-5e4a-4c56-8738-0ea78867d785-iptables-alerter-script\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690942 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovnkube-config\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.690979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-598ns\" (UniqueName: 
\"kubernetes.io/projected/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-kube-api-access-598ns\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-cni-dir\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-daemon-config\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-kubelet\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691115 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-systemd\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691124 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovnkube-config\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691140 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysctl-d\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sknnb\" (UniqueName: \"kubernetes.io/projected/ee2e975b-8948-45ed-9de6-345f4c54c29e-kube-api-access-sknnb\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-kubelet\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-cni-bin\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691238 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.691479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovnkube-script-lib\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-system-cni-dir\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691332 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-cnibin\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691357 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-sys-fs\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-systemd\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsv6l\" (UniqueName: \"kubernetes.io/projected/6db469f2-5afc-41c5-8338-9558deee2bd6-kube-api-access-qsv6l\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:26:41.691414 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-slash\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691448 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee2e975b-8948-45ed-9de6-345f4c54c29e-host\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-cni-multus\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/205a6f27-fa4b-46d4-bf0b-d91aa8cf134c-agent-certs\") pod \"konnectivity-agent-xnxmv\" (UID: \"205a6f27-fa4b-46d4-bf0b-d91aa8cf134c\") " pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:26:41.692264 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:26:41.691506 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-modprobe-d\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-os-release\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691538 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-netns\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-os-release\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.692264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-cni-bin\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:26:41.691582 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysconfig\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691654 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-slash\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee2e975b-8948-45ed-9de6-345f4c54c29e-host\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691673 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-tuned\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjjd9\" (UniqueName: \"kubernetes.io/projected/1e7352a7-e690-4558-a1e5-876926d3a57f-kube-api-access-pjjd9\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691717 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysconfig\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-conf-dir\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-conf-dir\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691753 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691774 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-var-lib-cni-multus\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6fdk\" (UniqueName: \"kubernetes.io/projected/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-kube-api-access-r6fdk\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-k8s-cni-cncf-io\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysctl-d\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:26:41.691822 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-ovn\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691843 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-modprobe-d\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.693089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-cnibin\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-system-cni-dir\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-os-release\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-multus-cni-dir\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691909 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-netns\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691918 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-node-log\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-os-release\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691951 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.691967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-registration-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysctl-conf\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-registration-dir\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692033 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-run-netns\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692067 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-node-log\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692106 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-host-run-netns\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-sys-fs\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e0edc9b-0c6c-4c56-a267-47907a7053fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692202 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-sysctl-conf\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692216 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f06c4fe-10e6-4600-864c-07dad67ed49f-host-run-k8s-cni-cncf-io\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.693818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692282 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-run-ovn\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.694370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692870 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f06c4fe-10e6-4600-864c-07dad67ed49f-cni-binary-copy\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.694370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.692954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e7352a7-e690-4558-a1e5-876926d3a57f-tmp\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.694370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.693090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovn-node-metrics-cert\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.694370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.694359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-ovnkube-script-lib\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.694903 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.694881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/205a6f27-fa4b-46d4-bf0b-d91aa8cf134c-agent-certs\") pod \"konnectivity-agent-xnxmv\" (UID: \"205a6f27-fa4b-46d4-bf0b-d91aa8cf134c\") " pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:26:41.695140 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.695120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1e7352a7-e690-4558-a1e5-876926d3a57f-etc-tuned\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.696689 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.696636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j57j\" (UniqueName: \"kubernetes.io/projected/6e0edc9b-0c6c-4c56-a267-47907a7053fd-kube-api-access-8j57j\") pod \"aws-ebs-csi-driver-node-fhdw4\" (UID: \"6e0edc9b-0c6c-4c56-a267-47907a7053fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.700374 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:41.700353 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:41.700480 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:41.700380 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:41.700480 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:41.700394 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58f4n for pod openshift-network-diagnostics/network-check-target-77b2w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:41.700480 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:41.700449 
2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n podName:9374c699-094f-4c29-9406-afbd076c9722 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:42.200432213 +0000 UTC m=+3.048476321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-58f4n" (UniqueName: "kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n") pod "network-check-target-77b2w" (UID: "9374c699-094f-4c29-9406-afbd076c9722") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:41.700658 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.700612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhknx\" (UniqueName: \"kubernetes.io/projected/1d60af74-5e4a-4c56-8738-0ea78867d785-kube-api-access-fhknx\") pod \"iptables-alerter-7wctt\" (UID: \"1d60af74-5e4a-4c56-8738-0ea78867d785\") " pod="openshift-network-operator/iptables-alerter-7wctt" Apr 24 21:26:41.700658 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.700622 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tdnt\" (UniqueName: \"kubernetes.io/projected/95bfde24-898e-4ab6-9414-d93c895b9ba6-kube-api-access-9tdnt\") pod \"node-resolver-gh9lb\" (UID: \"95bfde24-898e-4ab6-9414-d93c895b9ba6\") " pod="openshift-dns/node-resolver-gh9lb" Apr 24 21:26:41.702331 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.702186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sknnb\" (UniqueName: \"kubernetes.io/projected/ee2e975b-8948-45ed-9de6-345f4c54c29e-kube-api-access-sknnb\") pod \"node-ca-ws882\" (UID: \"ee2e975b-8948-45ed-9de6-345f4c54c29e\") " pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.703044 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 21:26:41.702988 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86t9n\" (UniqueName: \"kubernetes.io/projected/7f06c4fe-10e6-4600-864c-07dad67ed49f-kube-api-access-86t9n\") pod \"multus-2gkkz\" (UID: \"7f06c4fe-10e6-4600-864c-07dad67ed49f\") " pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.703674 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.703578 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6fdk\" (UniqueName: \"kubernetes.io/projected/98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6-kube-api-access-r6fdk\") pod \"ovnkube-node-k487s\" (UID: \"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.703674 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.703631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsv6l\" (UniqueName: \"kubernetes.io/projected/6db469f2-5afc-41c5-8338-9558deee2bd6-kube-api-access-qsv6l\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:26:41.704464 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.704444 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-598ns\" (UniqueName: \"kubernetes.io/projected/9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b-kube-api-access-598ns\") pod \"multus-additional-cni-plugins-phlsx\" (UID: \"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b\") " pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:41.709621 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.709600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjjd9\" (UniqueName: \"kubernetes.io/projected/1e7352a7-e690-4558-a1e5-876926d3a57f-kube-api-access-pjjd9\") pod \"tuned-jk496\" (UID: \"1e7352a7-e690-4558-a1e5-876926d3a57f\") " pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 
24 21:26:41.853442 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.853354 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:41.871825 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.871794 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:26:41.877586 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.877564 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gh9lb" Apr 24 21:26:41.886226 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.886209 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2gkkz" Apr 24 21:26:41.890636 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.890614 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7wctt" Apr 24 21:26:41.897336 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.897316 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" Apr 24 21:26:41.904852 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.904832 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:26:41.910405 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.910382 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jk496" Apr 24 21:26:41.917886 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.917869 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ws882" Apr 24 21:26:41.922421 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:41.922402 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-phlsx" Apr 24 21:26:42.122120 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.121890 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7352a7_e690_4558_a1e5_876926d3a57f.slice/crio-cf9a72809ae4c76f2469f1172ce566f24113decc1cdc443e5c2fe38db2192ddb WatchSource:0}: Error finding container cf9a72809ae4c76f2469f1172ce566f24113decc1cdc443e5c2fe38db2192ddb: Status 404 returned error can't find the container with id cf9a72809ae4c76f2469f1172ce566f24113decc1cdc443e5c2fe38db2192ddb Apr 24 21:26:42.122925 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.122901 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd2f7a_2a8b_46c5_8a0b_35d6ebb920c6.slice/crio-df597a57f3c22194432f9cf07052acedfd5eab7904bb78458da56e9a88faef14 WatchSource:0}: Error finding container df597a57f3c22194432f9cf07052acedfd5eab7904bb78458da56e9a88faef14: Status 404 returned error can't find the container with id df597a57f3c22194432f9cf07052acedfd5eab7904bb78458da56e9a88faef14 Apr 24 21:26:42.124761 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.124733 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod205a6f27_fa4b_46d4_bf0b_d91aa8cf134c.slice/crio-d855a3e45475788c531202804c593cc282b90b4dcf7498518746121a4aabf021 WatchSource:0}: Error finding container d855a3e45475788c531202804c593cc282b90b4dcf7498518746121a4aabf021: Status 404 returned error can't find the container with id d855a3e45475788c531202804c593cc282b90b4dcf7498518746121a4aabf021 Apr 24 21:26:42.125678 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.125593 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0edc9b_0c6c_4c56_a267_47907a7053fd.slice/crio-c0aa1bc210ecc555a7617e2c431b7229c60b4be6be714916b2d87d1e5e4c9704 WatchSource:0}: Error finding container c0aa1bc210ecc555a7617e2c431b7229c60b4be6be714916b2d87d1e5e4c9704: Status 404 returned error can't find the container with id c0aa1bc210ecc555a7617e2c431b7229c60b4be6be714916b2d87d1e5e4c9704 Apr 24 21:26:42.128286 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.128191 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d60af74_5e4a_4c56_8738_0ea78867d785.slice/crio-4cc144d5e5ebb9487eb320553d8e679f1ec6526d677252db1d3579576a91632d WatchSource:0}: Error finding container 4cc144d5e5ebb9487eb320553d8e679f1ec6526d677252db1d3579576a91632d: Status 404 returned error can't find the container with id 4cc144d5e5ebb9487eb320553d8e679f1ec6526d677252db1d3579576a91632d Apr 24 21:26:42.130106 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.130081 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95bfde24_898e_4ab6_9414_d93c895b9ba6.slice/crio-9e6428c55a4142e36723e023c86c3dab85a3778f4457b4cdae5e381f778ead76 WatchSource:0}: Error finding container 9e6428c55a4142e36723e023c86c3dab85a3778f4457b4cdae5e381f778ead76: Status 404 returned error can't find the container with id 9e6428c55a4142e36723e023c86c3dab85a3778f4457b4cdae5e381f778ead76 Apr 24 21:26:42.131429 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.131405 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2e975b_8948_45ed_9de6_345f4c54c29e.slice/crio-aac72d491c5012f09b71df453e876669802331ae43a9f2fcfe9d93c861bf5d1e WatchSource:0}: Error finding container aac72d491c5012f09b71df453e876669802331ae43a9f2fcfe9d93c861bf5d1e: Status 404 returned error can't find the 
container with id aac72d491c5012f09b71df453e876669802331ae43a9f2fcfe9d93c861bf5d1e Apr 24 21:26:42.133478 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.133454 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f26a9e4_a21b_4fd3_a28f_f1eef2985f9b.slice/crio-393f53eee496512405532b08e4bd56a63c0925bb606db795298cb2f056be5a99 WatchSource:0}: Error finding container 393f53eee496512405532b08e4bd56a63c0925bb606db795298cb2f056be5a99: Status 404 returned error can't find the container with id 393f53eee496512405532b08e4bd56a63c0925bb606db795298cb2f056be5a99 Apr 24 21:26:42.134933 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:26:42.134612 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f06c4fe_10e6_4600_864c_07dad67ed49f.slice/crio-6940e2f9a0828e7b8c6bb3f80f04d8f9e1cf541b4c78d4f0a9b0d789c51a4811 WatchSource:0}: Error finding container 6940e2f9a0828e7b8c6bb3f80f04d8f9e1cf541b4c78d4f0a9b0d789c51a4811: Status 404 returned error can't find the container with id 6940e2f9a0828e7b8c6bb3f80f04d8f9e1cf541b4c78d4f0a9b0d789c51a4811 Apr 24 21:26:42.195947 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.195926 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:26:42.196037 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:42.196026 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:42.196077 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:42.196072 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:43.196058758 +0000 UTC m=+4.044102852 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:42.296851 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.296811 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:26:42.296997 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:42.296979 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:42.297054 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:42.297000 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:42.297054 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:42.297011 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58f4n for pod openshift-network-diagnostics/network-check-target-77b2w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:42.297119 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:42.297059 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n podName:9374c699-094f-4c29-9406-afbd076c9722 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:43.297044822 +0000 UTC m=+4.145088916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-58f4n" (UniqueName: "kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n") pod "network-check-target-77b2w" (UID: "9374c699-094f-4c29-9406-afbd076c9722") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:42.614134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.614069 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:40 +0000 UTC" deadline="2028-01-03 05:32:20.84189587 +0000 UTC" Apr 24 21:26:42.614134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.614102 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14840h5m38.227797194s" Apr 24 21:26:42.658407 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.657736 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal" event={"ID":"fb850784b78b1ad1b82871c81c9b1b43","Type":"ContainerStarted","Data":"9aa5bdb6b4a296817cbb82e0a7845904c6237ed060d2ea710122fb4ac065f6f1"} Apr 24 21:26:42.668272 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.668175 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerStarted","Data":"393f53eee496512405532b08e4bd56a63c0925bb606db795298cb2f056be5a99"} Apr 24 21:26:42.679973 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:26:42.679819 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ws882" event={"ID":"ee2e975b-8948-45ed-9de6-345f4c54c29e","Type":"ContainerStarted","Data":"aac72d491c5012f09b71df453e876669802331ae43a9f2fcfe9d93c861bf5d1e"} Apr 24 21:26:42.690348 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.690317 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gh9lb" event={"ID":"95bfde24-898e-4ab6-9414-d93c895b9ba6","Type":"ContainerStarted","Data":"9e6428c55a4142e36723e023c86c3dab85a3778f4457b4cdae5e381f778ead76"} Apr 24 21:26:42.695095 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.695066 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"df597a57f3c22194432f9cf07052acedfd5eab7904bb78458da56e9a88faef14"} Apr 24 21:26:42.698896 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.698866 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" event={"ID":"6e0edc9b-0c6c-4c56-a267-47907a7053fd","Type":"ContainerStarted","Data":"c0aa1bc210ecc555a7617e2c431b7229c60b4be6be714916b2d87d1e5e4c9704"} Apr 24 21:26:42.706086 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.706028 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xnxmv" event={"ID":"205a6f27-fa4b-46d4-bf0b-d91aa8cf134c","Type":"ContainerStarted","Data":"d855a3e45475788c531202804c593cc282b90b4dcf7498518746121a4aabf021"} Apr 24 21:26:42.711981 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.711930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gkkz" event={"ID":"7f06c4fe-10e6-4600-864c-07dad67ed49f","Type":"ContainerStarted","Data":"6940e2f9a0828e7b8c6bb3f80f04d8f9e1cf541b4c78d4f0a9b0d789c51a4811"} Apr 24 21:26:42.717053 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 21:26:42.717027 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7wctt" event={"ID":"1d60af74-5e4a-4c56-8738-0ea78867d785","Type":"ContainerStarted","Data":"4cc144d5e5ebb9487eb320553d8e679f1ec6526d677252db1d3579576a91632d"} Apr 24 21:26:42.727311 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:42.722661 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jk496" event={"ID":"1e7352a7-e690-4558-a1e5-876926d3a57f","Type":"ContainerStarted","Data":"cf9a72809ae4c76f2469f1172ce566f24113decc1cdc443e5c2fe38db2192ddb"} Apr 24 21:26:43.202112 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:43.202050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:26:43.202317 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:43.202254 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:43.202439 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:43.202337 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:45.202319098 +0000 UTC m=+6.050363194 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:43.302683 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:43.302649 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:43.302854 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:43.302836 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:43.302930 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:43.302861 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:43.302930 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:43.302875 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58f4n for pod openshift-network-diagnostics/network-check-target-77b2w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:43.302930 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:43.302926 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n podName:9374c699-094f-4c29-9406-afbd076c9722 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:45.30290714 +0000 UTC m=+6.150951256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-58f4n" (UniqueName: "kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n") pod "network-check-target-77b2w" (UID: "9374c699-094f-4c29-9406-afbd076c9722") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:43.644492 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:43.644458 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:43.644950 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:43.644598 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:43.645133 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:43.645112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:43.645234 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:43.645212 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:43.737270 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:43.736082 2571 generic.go:358] "Generic (PLEG): container finished" podID="76273891b5ff00ed3370baaa1995c530" containerID="43a44d8f80fae5b35170ff5f51cad56676d671c2ff200c22b70188035f3fba1b" exitCode=0
Apr 24 21:26:43.737270 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:43.736923 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" event={"ID":"76273891b5ff00ed3370baaa1995c530","Type":"ContainerDied","Data":"43a44d8f80fae5b35170ff5f51cad56676d671c2ff200c22b70188035f3fba1b"}
Apr 24 21:26:43.750979 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:43.750482 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-5.ec2.internal" podStartSLOduration=3.75046427 podStartE2EDuration="3.75046427s" podCreationTimestamp="2026-04-24 21:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:42.670518406 +0000 UTC m=+3.518562523" watchObservedRunningTime="2026-04-24 21:26:43.75046427 +0000 UTC m=+4.598508388"
Apr 24 21:26:44.742184 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:44.742130 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" event={"ID":"76273891b5ff00ed3370baaa1995c530","Type":"ContainerStarted","Data":"9abee5574dbd69d37fd22720d9edc74090a5d5f8eda1c534f425495572325481"}
Apr 24 21:26:45.220987 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:45.220370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:45.220987 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:45.220525 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:45.220987 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:45.220592 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:49.220572018 +0000 UTC m=+10.068616117 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:45.321856 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:45.321749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:45.322006 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:45.321944 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:45.322006 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:45.321973 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:45.322006 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:45.321988 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58f4n for pod openshift-network-diagnostics/network-check-target-77b2w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:45.322161 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:45.322048 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n podName:9374c699-094f-4c29-9406-afbd076c9722 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:49.322029694 +0000 UTC m=+10.170073803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-58f4n" (UniqueName: "kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n") pod "network-check-target-77b2w" (UID: "9374c699-094f-4c29-9406-afbd076c9722") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:45.649794 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:45.649762 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:45.649958 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:45.649897 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:45.650464 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:45.650431 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:45.650573 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:45.650539 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:47.644468 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:47.643899 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:47.644468 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:47.643899 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:47.644468 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:47.644045 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:47.644468 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:47.644112 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:49.255423 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:49.255383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:49.255874 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:49.255527 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:49.255874 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:49.255580 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:57.255564763 +0000 UTC m=+18.103608858 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:49.356561 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:49.356521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:49.356733 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:49.356691 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:49.356733 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:49.356714 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:49.356733 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:49.356724 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58f4n for pod openshift-network-diagnostics/network-check-target-77b2w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:49.356882 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:49.356769 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n podName:9374c699-094f-4c29-9406-afbd076c9722 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:57.356753773 +0000 UTC m=+18.204797867 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-58f4n" (UniqueName: "kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n") pod "network-check-target-77b2w" (UID: "9374c699-094f-4c29-9406-afbd076c9722") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:49.647580 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:49.646863 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:49.647580 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:49.646988 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:49.647580 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:49.647374 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:49.647580 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:49.647461 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:51.643887 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:51.643803 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:51.644355 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:51.643922 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:51.644355 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:51.644004 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:51.644355 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:51.644141 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:53.643378 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:53.643342 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:53.643782 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:53.643382 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:53.643782 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:53.643494 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:53.643782 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:53.643622 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:55.643696 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:55.643665 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:55.644092 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:55.643713 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:55.644092 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:55.643781 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:55.644092 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:55.643839 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:57.314366 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:57.314334 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:57.314752 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:57.314456 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:57.314752 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:57.314509 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:13.314495214 +0000 UTC m=+34.162539307 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:57.415152 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:57.415114 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:57.415338 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:57.415318 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:57.415380 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:57.415344 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:57.415380 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:57.415358 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58f4n for pod openshift-network-diagnostics/network-check-target-77b2w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:57.415465 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:57.415420 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n podName:9374c699-094f-4c29-9406-afbd076c9722 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:13.415404354 +0000 UTC m=+34.263448452 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-58f4n" (UniqueName: "kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n") pod "network-check-target-77b2w" (UID: "9374c699-094f-4c29-9406-afbd076c9722") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:57.643402 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:57.643319 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:57.643554 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:57.643324 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:57.643554 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:57.643441 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:57.643554 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:57.643516 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:59.644849 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.644678 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:26:59.645233 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.644748 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:26:59.645233 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:59.644917 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:26:59.645233 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:26:59.645008 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:26:59.775007 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.774980 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerStarted","Data":"3a0207ae5087149a892391cc3a338e5843ae9ab5f17d925ed4d5a19545dac8c2"}
Apr 24 21:26:59.776799 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.776763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gh9lb" event={"ID":"95bfde24-898e-4ab6-9414-d93c895b9ba6","Type":"ContainerStarted","Data":"4d504de6a1c510a0904a596ab577a12be2c3119729b51c8bc05875db08118c5d"}
Apr 24 21:26:59.778852 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.778825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xnxmv" event={"ID":"205a6f27-fa4b-46d4-bf0b-d91aa8cf134c","Type":"ContainerStarted","Data":"8221ce7a74c2188172b18692cda001ce1e3b41ce5a4ee90439f1151aa12cdd1d"}
Apr 24 21:26:59.779907 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.779880 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gkkz" event={"ID":"7f06c4fe-10e6-4600-864c-07dad67ed49f","Type":"ContainerStarted","Data":"668a4dc48b7a8c1f4d299c41710a0d2d5044399768981623995fdc253364b21f"}
Apr 24 21:26:59.781100 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.781078 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jk496" event={"ID":"1e7352a7-e690-4558-a1e5-876926d3a57f","Type":"ContainerStarted","Data":"cd93e73a8fc5b66bfe80d2112d6436b70abf3c67468708c9462dfdb9f02f1114"}
Apr 24 21:26:59.799247 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.799203 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-5.ec2.internal" podStartSLOduration=19.799190525 podStartE2EDuration="19.799190525s" podCreationTimestamp="2026-04-24 21:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:44.756376871 +0000 UTC m=+5.604420988" watchObservedRunningTime="2026-04-24 21:26:59.799190525 +0000 UTC m=+20.647234641"
Apr 24 21:26:59.828194 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.828076 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xnxmv" podStartSLOduration=7.996786553 podStartE2EDuration="20.828060188s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:26:42.127070521 +0000 UTC m=+2.975114615" lastFinishedPulling="2026-04-24 21:26:54.95834414 +0000 UTC m=+15.806388250" observedRunningTime="2026-04-24 21:26:59.827849089 +0000 UTC m=+20.675893205" watchObservedRunningTime="2026-04-24 21:26:59.828060188 +0000 UTC m=+20.676104305"
Apr 24 21:26:59.828565 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.828517 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gh9lb" podStartSLOduration=3.585024087 podStartE2EDuration="20.828507911s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:26:42.133791553 +0000 UTC m=+2.981835662" lastFinishedPulling="2026-04-24 21:26:59.377275389 +0000 UTC m=+20.225319486" observedRunningTime="2026-04-24 21:26:59.812462747 +0000 UTC m=+20.660506865" watchObservedRunningTime="2026-04-24 21:26:59.828507911 +0000 UTC m=+20.676552027"
Apr 24 21:26:59.847210 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.847177 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jk496" podStartSLOduration=3.638208412 podStartE2EDuration="20.847164368s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:26:42.123436973 +0000 UTC m=+2.971481070" lastFinishedPulling="2026-04-24 21:26:59.33239293 +0000 UTC m=+20.180437026" observedRunningTime="2026-04-24 21:26:59.846955441 +0000 UTC m=+20.694999556" watchObservedRunningTime="2026-04-24 21:26:59.847164368 +0000 UTC m=+20.695208484"
Apr 24 21:26:59.896135 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:26:59.896113 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xnxmv"
Apr 24 21:27:00.785438 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.785217 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ws882" event={"ID":"ee2e975b-8948-45ed-9de6-345f4c54c29e","Type":"ContainerStarted","Data":"e05360cc93067d751d08c279d07837f7bd8b2fccaacc1682e028f8bcdf0ae27c"}
Apr 24 21:27:00.788563 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.788531 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"028edca5ebdd10effeb1f11ccd9fe21ca2ecc8c1a724e7eb017766d6e9e50243"}
Apr 24 21:27:00.788563 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.788561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"84375ab5ebed4fc009ecdd75cc76d982424290ad2d801b2741c0d5584f5d6ed0"}
Apr 24 21:27:00.788707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.788570 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"c8abef34a60f5dd3d6669dd3c8a9335a7d03db0e18e87cdee178028f007306f4"}
Apr 24 21:27:00.788707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.788578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"ff17929d7633cdd3fdfee4f977b63d9da5f13c49f57ad4f9594ec85dab1a6e5b"}
Apr 24 21:27:00.788707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.788586 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"78b1ae37de417ae146dd0dec1c11506f1402e23e2ef934458728b90d407be63e"}
Apr 24 21:27:00.788707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.788594 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"5896b69965c7c351d79d54e281cd9e48bc4cc1aab51be41ed06e8777a57e994a"}
Apr 24 21:27:00.790200 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.790177 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" event={"ID":"6e0edc9b-0c6c-4c56-a267-47907a7053fd","Type":"ContainerStarted","Data":"7a5d90094edecadddb6a0ef9a8a9f8308b17277135bfaa8bcb615c496b7416bf"}
Apr 24 21:27:00.791801 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.791775 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7wctt" event={"ID":"1d60af74-5e4a-4c56-8738-0ea78867d785","Type":"ContainerStarted","Data":"3ab4325e4efa21f6d846ada3d68df6d55c06bec2ac7485f488684a9f4f3ae792"}
Apr 24 21:27:00.793629 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.793605 2571 generic.go:358] "Generic (PLEG): container finished" podID="9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b" containerID="3a0207ae5087149a892391cc3a338e5843ae9ab5f17d925ed4d5a19545dac8c2" exitCode=0
Apr 24 21:27:00.793717 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.793704 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerDied","Data":"3a0207ae5087149a892391cc3a338e5843ae9ab5f17d925ed4d5a19545dac8c2"}
Apr 24 21:27:00.805323 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.805262 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ws882" podStartSLOduration=4.561364688 podStartE2EDuration="21.805248364s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:26:42.135561282 +0000 UTC m=+2.983605376" lastFinishedPulling="2026-04-24 21:26:59.379444948 +0000 UTC m=+20.227489052" observedRunningTime="2026-04-24 21:27:00.804703741 +0000 UTC m=+21.652747854" watchObservedRunningTime="2026-04-24 21:27:00.805248364 +0000 UTC m=+21.653292481"
Apr 24 21:27:00.819224 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.819188 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7wctt" podStartSLOduration=4.572559513 podStartE2EDuration="21.819179274s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:26:42.13039337 +0000 UTC m=+2.978437485" lastFinishedPulling="2026-04-24 21:26:59.377013137 +0000 UTC m=+20.225057246" observedRunningTime="2026-04-24 21:27:00.81890327 +0000 UTC m=+21.666947397" watchObservedRunningTime="2026-04-24 21:27:00.819179274 +0000 UTC m=+21.667223391"
Apr 24 21:27:00.875519 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:00.875494 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:27:01.643890 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.643516 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:27:01.643890 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:01.643633 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6"
Apr 24 21:27:01.643890 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.643660 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:27:01.643890 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:01.643754 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722"
Apr 24 21:27:01.645806 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.645718 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:00.875518116Z","UUID":"4634728b-7eed-403d-a2d5-a6dcadad0fda","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:27:01.651169 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.651140 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:27:01.651264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.651182 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:27:01.797589 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.797533 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" event={"ID":"6e0edc9b-0c6c-4c56-a267-47907a7053fd","Type":"ContainerStarted","Data":"b711bee8d73f737d3c38aa728e9a33d658fd73bb6d8647d12a72ad1e4603be3e"}
Apr 24 21:27:01.797589 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.797593 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" event={"ID":"6e0edc9b-0c6c-4c56-a267-47907a7053fd","Type":"ContainerStarted","Data":"b7d146a13a2e5266d3d77044c168b22c52dfcccbd6d5bd7b53a2796860e9caf8"}
Apr 24 21:27:01.823614 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.823565 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fhdw4" podStartSLOduration=3.316600171 podStartE2EDuration="22.823550305s" podCreationTimestamp="2026-04-24 21:26:39
+0000 UTC" firstStartedPulling="2026-04-24 21:26:42.128611319 +0000 UTC m=+2.976655419" lastFinishedPulling="2026-04-24 21:27:01.635561445 +0000 UTC m=+22.483605553" observedRunningTime="2026-04-24 21:27:01.822985824 +0000 UTC m=+22.671029952" watchObservedRunningTime="2026-04-24 21:27:01.823550305 +0000 UTC m=+22.671594421" Apr 24 21:27:01.823782 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:01.823714 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2gkkz" podStartSLOduration=5.553341407 podStartE2EDuration="22.823708692s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:26:42.136209322 +0000 UTC m=+2.984253415" lastFinishedPulling="2026-04-24 21:26:59.406576601 +0000 UTC m=+20.254620700" observedRunningTime="2026-04-24 21:27:00.869465873 +0000 UTC m=+21.717509999" watchObservedRunningTime="2026-04-24 21:27:01.823708692 +0000 UTC m=+22.671752874" Apr 24 21:27:02.802576 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:02.802390 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"c8e9d414e2fab6a1da518749f3498ee93907b3f74ea11809caa15092a467892b"} Apr 24 21:27:03.644232 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:03.644203 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:27:03.644232 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:03.644230 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:03.644517 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:03.644346 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6" Apr 24 21:27:03.644517 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:03.644472 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722" Apr 24 21:27:03.744849 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:03.744805 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:27:03.745466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:03.745438 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:27:03.804846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:03.804820 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xnxmv" Apr 24 21:27:04.813748 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:04.813631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" event={"ID":"98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6","Type":"ContainerStarted","Data":"f2eab5c629b1f9d5c6b5a4e9fd9cc472d792f213dca3768a0f9e6f89c56981a8"} Apr 24 
21:27:04.814865 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:04.813995 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:27:04.814865 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:04.814055 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:27:04.814865 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:04.814070 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:27:04.828852 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:04.828822 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:27:04.829220 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:04.829206 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:27:04.843060 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:04.843024 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" podStartSLOduration=8.138149019 podStartE2EDuration="25.843013711s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:26:42.124866463 +0000 UTC m=+2.972910557" lastFinishedPulling="2026-04-24 21:26:59.829731156 +0000 UTC m=+20.677775249" observedRunningTime="2026-04-24 21:27:04.842623539 +0000 UTC m=+25.690667667" watchObservedRunningTime="2026-04-24 21:27:04.843013711 +0000 UTC m=+25.691057827" Apr 24 21:27:05.644456 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:05.644252 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:27:05.644715 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:05.644263 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:05.644715 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:05.644555 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6" Apr 24 21:27:05.644715 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:05.644608 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722" Apr 24 21:27:05.817090 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:05.817056 2571 generic.go:358] "Generic (PLEG): container finished" podID="9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b" containerID="5bf10d8cac31ee70af1947c182edc88cb839482b2916ec4a8d0b294c3523aaff" exitCode=0 Apr 24 21:27:05.817554 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:05.817130 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerDied","Data":"5bf10d8cac31ee70af1947c182edc88cb839482b2916ec4a8d0b294c3523aaff"} Apr 24 21:27:06.677621 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:06.677592 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-77b2w"] Apr 24 21:27:06.677730 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:06.677706 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:06.677842 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:06.677809 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722" Apr 24 21:27:06.679406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:06.679386 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-npqvg"] Apr 24 21:27:06.679501 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:06.679456 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:27:06.679546 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:06.679527 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6" Apr 24 21:27:07.825693 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:07.825431 2571 generic.go:358] "Generic (PLEG): container finished" podID="9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b" containerID="9c20330fdb7c3f9db0e1f3dd9ae64ca39f93275193da68aa6ac08de3619a631b" exitCode=0 Apr 24 21:27:07.826072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:07.825514 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerDied","Data":"9c20330fdb7c3f9db0e1f3dd9ae64ca39f93275193da68aa6ac08de3619a631b"} Apr 24 21:27:08.644309 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:08.644275 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:08.644408 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:08.644323 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:27:08.644408 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:08.644399 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722" Apr 24 21:27:08.644562 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:08.644540 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6" Apr 24 21:27:08.829201 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:08.829169 2571 generic.go:358] "Generic (PLEG): container finished" podID="9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b" containerID="08faa90bc562e73b56abd4c3605a5b71afd9b043bb79a876c08747cc550a32cb" exitCode=0 Apr 24 21:27:08.829592 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:08.829221 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerDied","Data":"08faa90bc562e73b56abd4c3605a5b71afd9b043bb79a876c08747cc550a32cb"} Apr 24 21:27:10.643385 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:10.643349 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:27:10.643385 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:10.643401 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:10.644202 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:10.643521 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6" Apr 24 21:27:10.644202 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:10.643636 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722" Apr 24 21:27:12.644229 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:12.644201 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:27:12.644729 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:12.644200 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:12.644729 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:12.644360 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6" Apr 24 21:27:12.644729 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:12.644397 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-77b2w" podUID="9374c699-094f-4c29-9406-afbd076c9722" Apr 24 21:27:13.328023 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.327990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:27:13.328201 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.328102 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:13.328201 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.328157 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:45.328140099 +0000 UTC m=+66.176184206 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:13.429011 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.428931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:13.429143 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.429093 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:13.429143 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.429112 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:13.429143 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.429121 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58f4n for pod openshift-network-diagnostics/network-check-target-77b2w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:13.429255 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.429177 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n podName:9374c699-094f-4c29-9406-afbd076c9722 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:45.429160199 +0000 UTC m=+66.277204313 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-58f4n" (UniqueName: "kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n") pod "network-check-target-77b2w" (UID: "9374c699-094f-4c29-9406-afbd076c9722") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:13.489838 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.489806 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-5.ec2.internal" event="NodeReady" Apr 24 21:27:13.489998 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.489957 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:27:13.539512 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.539480 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pjmg6"] Apr 24 21:27:13.580959 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.580932 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hmts4"] Apr 24 21:27:13.581111 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.581086 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:27:13.584123 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.584097 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:27:13.584269 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.584221 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:27:13.584369 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.584352 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:27:13.584424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.584107 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv4ts\"" Apr 24 21:27:13.593419 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.593396 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pjmg6"] Apr 24 21:27:13.593580 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.593425 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hmts4"] Apr 24 21:27:13.593580 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.593520 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:13.597169 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.597146 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:27:13.597425 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.597393 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zkwd8\"" Apr 24 21:27:13.597637 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.597619 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:27:13.732698 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.732596 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d586c6-47b1-4dc6-96e0-7dac12734909-config-volume\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:13.732698 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.732685 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k87mw\" (UniqueName: \"kubernetes.io/projected/3d01f7a9-76ee-487e-9801-6c420df8721a-kube-api-access-k87mw\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:27:13.733145 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.732717 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07d586c6-47b1-4dc6-96e0-7dac12734909-tmp-dir\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:13.733145 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.732744 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:13.733145 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.732843 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:27:13.733145 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.732881 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdsl5\" (UniqueName: \"kubernetes.io/projected/07d586c6-47b1-4dc6-96e0-7dac12734909-kube-api-access-gdsl5\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:13.834187 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.834080 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:13.834187 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.834183 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:13.834462 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.834244 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" 
(UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:27:13.834462 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.834265 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls podName:07d586c6-47b1-4dc6-96e0-7dac12734909 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:14.334245585 +0000 UTC m=+35.182289682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls") pod "dns-default-hmts4" (UID: "07d586c6-47b1-4dc6-96e0-7dac12734909") : secret "dns-default-metrics-tls" not found Apr 24 21:27:13.834462 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.834322 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdsl5\" (UniqueName: \"kubernetes.io/projected/07d586c6-47b1-4dc6-96e0-7dac12734909-kube-api-access-gdsl5\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:13.834462 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.834348 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:13.834462 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.834363 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d586c6-47b1-4dc6-96e0-7dac12734909-config-volume\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:13.834462 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:13.834395 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert podName:3d01f7a9-76ee-487e-9801-6c420df8721a 
nodeName:}" failed. No retries permitted until 2026-04-24 21:27:14.334378943 +0000 UTC m=+35.182423040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert") pod "ingress-canary-pjmg6" (UID: "3d01f7a9-76ee-487e-9801-6c420df8721a") : secret "canary-serving-cert" not found
Apr 24 21:27:13.834462 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.834436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k87mw\" (UniqueName: \"kubernetes.io/projected/3d01f7a9-76ee-487e-9801-6c420df8721a-kube-api-access-k87mw\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6"
Apr 24 21:27:13.834804 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.834478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07d586c6-47b1-4dc6-96e0-7dac12734909-tmp-dir\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:13.834804 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.834797 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07d586c6-47b1-4dc6-96e0-7dac12734909-tmp-dir\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:13.834880 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.834835 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d586c6-47b1-4dc6-96e0-7dac12734909-config-volume\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:13.855105 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.847222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdsl5\" (UniqueName: \"kubernetes.io/projected/07d586c6-47b1-4dc6-96e0-7dac12734909-kube-api-access-gdsl5\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:13.855105 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:13.847350 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k87mw\" (UniqueName: \"kubernetes.io/projected/3d01f7a9-76ee-487e-9801-6c420df8721a-kube-api-access-k87mw\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6"
Apr 24 21:27:14.339458 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.339421 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:14.339643 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.339464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6"
Apr 24 21:27:14.339643 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:14.339586 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:14.339643 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:14.339595 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:14.339643 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:14.339644 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert podName:3d01f7a9-76ee-487e-9801-6c420df8721a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.339630084 +0000 UTC m=+36.187674178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert") pod "ingress-canary-pjmg6" (UID: "3d01f7a9-76ee-487e-9801-6c420df8721a") : secret "canary-serving-cert" not found
Apr 24 21:27:14.339852 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:14.339656 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls podName:07d586c6-47b1-4dc6-96e0-7dac12734909 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.339650568 +0000 UTC m=+36.187694662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls") pod "dns-default-hmts4" (UID: "07d586c6-47b1-4dc6-96e0-7dac12734909") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:14.643381 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.643317 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:27:14.643381 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.643343 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w"
Apr 24 21:27:14.646160 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.646135 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:27:14.647249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.647230 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:27:14.647351 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.647230 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vlmsd\""
Apr 24 21:27:14.647351 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.647309 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9whs\""
Apr 24 21:27:14.667336 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:14.667313 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:27:15.347705 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:15.347526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:15.348053 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:15.347727 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6"
Apr 24 21:27:15.348053 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:15.347670 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:15.348053 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:15.347812 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:15.348053 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:15.347857 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls podName:07d586c6-47b1-4dc6-96e0-7dac12734909 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:17.347837152 +0000 UTC m=+38.195881258 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls") pod "dns-default-hmts4" (UID: "07d586c6-47b1-4dc6-96e0-7dac12734909") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:15.348053 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:15.347873 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert podName:3d01f7a9-76ee-487e-9801-6c420df8721a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:17.34786687 +0000 UTC m=+38.195910963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert") pod "ingress-canary-pjmg6" (UID: "3d01f7a9-76ee-487e-9801-6c420df8721a") : secret "canary-serving-cert" not found
Apr 24 21:27:15.843495 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:15.843459 2571 generic.go:358] "Generic (PLEG): container finished" podID="9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b" containerID="81b8be15f90077033f0b40b5b0456ed81390feebe777b1500ef051486b59bf29" exitCode=0
Apr 24 21:27:15.843684 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:15.843502 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerDied","Data":"81b8be15f90077033f0b40b5b0456ed81390feebe777b1500ef051486b59bf29"}
Apr 24 21:27:16.848358 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:16.848327 2571 generic.go:358] "Generic (PLEG): container finished" podID="9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b" containerID="d61b912deabe28d17308624d9cbaa2a997e7858f8265f8c0e03277bc2db39c62" exitCode=0
Apr 24 21:27:16.848713 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:16.848385 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerDied","Data":"d61b912deabe28d17308624d9cbaa2a997e7858f8265f8c0e03277bc2db39c62"}
Apr 24 21:27:17.364166 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:17.364130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:17.364382 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:17.364179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6"
Apr 24 21:27:17.364382 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:17.364274 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:17.364382 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:17.364344 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:17.364382 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:17.364354 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls podName:07d586c6-47b1-4dc6-96e0-7dac12734909 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:21.36433844 +0000 UTC m=+42.212382535 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls") pod "dns-default-hmts4" (UID: "07d586c6-47b1-4dc6-96e0-7dac12734909") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:17.364546 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:17.364395 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert podName:3d01f7a9-76ee-487e-9801-6c420df8721a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:21.364377287 +0000 UTC m=+42.212421384 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert") pod "ingress-canary-pjmg6" (UID: "3d01f7a9-76ee-487e-9801-6c420df8721a") : secret "canary-serving-cert" not found
Apr 24 21:27:17.852750 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:17.852717 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phlsx" event={"ID":"9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b","Type":"ContainerStarted","Data":"483d27c1df6c59bc88da255961e4e448d44b259697e1e9dc02d18cac575792fa"}
Apr 24 21:27:17.878730 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:17.878677 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-phlsx" podStartSLOduration=6.207479693 podStartE2EDuration="38.878661109s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:26:42.136222793 +0000 UTC m=+2.984266905" lastFinishedPulling="2026-04-24 21:27:14.807404227 +0000 UTC m=+35.655448321" observedRunningTime="2026-04-24 21:27:17.877020726 +0000 UTC m=+38.725064842" watchObservedRunningTime="2026-04-24 21:27:17.878661109 +0000 UTC m=+38.726705226"
Apr 24 21:27:21.391277 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:21.391242 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6"
Apr 24 21:27:21.391719 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:21.391383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:21.391719 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:21.391405 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:21.391719 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:21.391470 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert podName:3d01f7a9-76ee-487e-9801-6c420df8721a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.391453729 +0000 UTC m=+50.239497823 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert") pod "ingress-canary-pjmg6" (UID: "3d01f7a9-76ee-487e-9801-6c420df8721a") : secret "canary-serving-cert" not found
Apr 24 21:27:21.391719 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:21.391497 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:21.391719 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:21.391546 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls podName:07d586c6-47b1-4dc6-96e0-7dac12734909 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.391530655 +0000 UTC m=+50.239574752 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls") pod "dns-default-hmts4" (UID: "07d586c6-47b1-4dc6-96e0-7dac12734909") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:25.331459 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.331425 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"]
Apr 24 21:27:25.383653 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.383629 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"]
Apr 24 21:27:25.383817 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.383755 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.395780 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.395760 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 21:27:25.396810 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.396785 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 21:27:25.396871 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.396785 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 21:27:25.397877 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.397863 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 21:27:25.414169 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.414146 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"]
Apr 24 21:27:25.440541 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.440519 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"]
Apr 24 21:27:25.440640 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.440627 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.444544 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.444529 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 21:27:25.445035 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.445004 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 21:27:25.445282 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.445265 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 21:27:25.445942 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.445899 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 21:27:25.520085 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.520055 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/78b91d10-b1cb-49df-9404-87640d341585-klusterlet-config\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.520217 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.520098 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78b91d10-b1cb-49df-9404-87640d341585-tmp\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.520217 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.520117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j2mr\" (UniqueName: \"kubernetes.io/projected/78b91d10-b1cb-49df-9404-87640d341585-kube-api-access-9j2mr\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.620876 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.620800 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.620876 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.620840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.620876 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.620870 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.621067 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.620947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmpz4\" (UniqueName: \"kubernetes.io/projected/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-kube-api-access-bmpz4\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.621067 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.620992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/78b91d10-b1cb-49df-9404-87640d341585-klusterlet-config\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.621067 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.621037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-ca\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.621176 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.621078 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78b91d10-b1cb-49df-9404-87640d341585-tmp\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.621176 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.621095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-hub\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.621176 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.621127 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j2mr\" (UniqueName: \"kubernetes.io/projected/78b91d10-b1cb-49df-9404-87640d341585-kube-api-access-9j2mr\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.621434 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.621419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78b91d10-b1cb-49df-9404-87640d341585-tmp\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.624622 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.624604 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/78b91d10-b1cb-49df-9404-87640d341585-klusterlet-config\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.634703 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.634677 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j2mr\" (UniqueName: \"kubernetes.io/projected/78b91d10-b1cb-49df-9404-87640d341585-kube-api-access-9j2mr\") pod \"klusterlet-addon-workmgr-848db9d749-mxw4x\" (UID: \"78b91d10-b1cb-49df-9404-87640d341585\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.692641 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.692610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:25.722415 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.722388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-ca\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.722535 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.722436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-hub\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.722535 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.722473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.722535 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.722514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.722692 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.722567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.722692 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.722612 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmpz4\" (UniqueName: \"kubernetes.io/projected/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-kube-api-access-bmpz4\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.723346 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.723274 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.725030 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.725002 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-ca\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.725183 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.725057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.725556 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.725537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.725673 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.725652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-hub\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.743680 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.743657 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmpz4\" (UniqueName: \"kubernetes.io/projected/0dc553d9-14cc-4d0f-8470-a69eed61b6b2-kube-api-access-bmpz4\") pod \"cluster-proxy-proxy-agent-6b87c788b4-dks6b\" (UID: \"0dc553d9-14cc-4d0f-8470-a69eed61b6b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.780861 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.780828 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:27:25.855980 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.855947 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"]
Apr 24 21:27:25.860187 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:27:25.860152 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b91d10_b1cb_49df_9404_87640d341585.slice/crio-4b9536498a0eee7d835200cc1bfde30cb564e6e2d2e99c52bd8669b9e60d5cbc WatchSource:0}: Error finding container 4b9536498a0eee7d835200cc1bfde30cb564e6e2d2e99c52bd8669b9e60d5cbc: Status 404 returned error can't find the container with id 4b9536498a0eee7d835200cc1bfde30cb564e6e2d2e99c52bd8669b9e60d5cbc
Apr 24 21:27:25.870256 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.870228 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x" event={"ID":"78b91d10-b1cb-49df-9404-87640d341585","Type":"ContainerStarted","Data":"4b9536498a0eee7d835200cc1bfde30cb564e6e2d2e99c52bd8669b9e60d5cbc"}
Apr 24 21:27:25.912161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:25.912098 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"]
Apr 24 21:27:25.914565 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:27:25.914532 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dc553d9_14cc_4d0f_8470_a69eed61b6b2.slice/crio-fdcad3502694978ccbdad90587720606ccfb34b37e62355a9a4b650896599d9d WatchSource:0}: Error finding container fdcad3502694978ccbdad90587720606ccfb34b37e62355a9a4b650896599d9d: Status 404 returned error can't find the container with id fdcad3502694978ccbdad90587720606ccfb34b37e62355a9a4b650896599d9d
Apr 24 21:27:26.873269 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:26.873226 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" event={"ID":"0dc553d9-14cc-4d0f-8470-a69eed61b6b2","Type":"ContainerStarted","Data":"fdcad3502694978ccbdad90587720606ccfb34b37e62355a9a4b650896599d9d"}
Apr 24 21:27:29.453842 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:29.453806 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6"
Apr 24 21:27:29.454272 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:29.453919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4"
Apr 24 21:27:29.454272 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:29.453991 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:29.454272 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:29.454062 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:29.454272 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:29.454069 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert podName:3d01f7a9-76ee-487e-9801-6c420df8721a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:45.454050587 +0000 UTC m=+66.302094700 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert") pod "ingress-canary-pjmg6" (UID: "3d01f7a9-76ee-487e-9801-6c420df8721a") : secret "canary-serving-cert" not found
Apr 24 21:27:29.454272 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:29.454114 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls podName:07d586c6-47b1-4dc6-96e0-7dac12734909 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:45.454098225 +0000 UTC m=+66.302142325 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls") pod "dns-default-hmts4" (UID: "07d586c6-47b1-4dc6-96e0-7dac12734909") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:29.880559 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:29.880521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x" event={"ID":"78b91d10-b1cb-49df-9404-87640d341585","Type":"ContainerStarted","Data":"1c042fda3a3518d92c50f4dabe12e7a56ba978bd5bc7f3d749753643e65a3a56"}
Apr 24 21:27:29.880812 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:29.880782 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:29.882426 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:29.882400 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x"
Apr 24 21:27:29.900592 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:29.900539 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x" podStartSLOduration=1.493195523 podStartE2EDuration="4.90052265s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:25.86243852 +0000 UTC m=+46.710482620" lastFinishedPulling="2026-04-24 21:27:29.269765643 +0000 UTC m=+50.117809747" observedRunningTime="2026-04-24 21:27:29.899502527 +0000 UTC m=+50.747546644" watchObservedRunningTime="2026-04-24 21:27:29.90052265 +0000 UTC m=+50.748566766"
Apr 24 21:27:30.883898 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:30.883867 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
event={"ID":"0dc553d9-14cc-4d0f-8470-a69eed61b6b2","Type":"ContainerStarted","Data":"94556312d87a740511c771353899f85c2a95f6078254c989453d37d92e89500e"} Apr 24 21:27:32.889152 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:32.889117 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" event={"ID":"0dc553d9-14cc-4d0f-8470-a69eed61b6b2","Type":"ContainerStarted","Data":"206af984808a3d0e4b417e3687500e2b23aa1e47a73c5cae16ca24921456b700"} Apr 24 21:27:32.889152 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:32.889152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" event={"ID":"0dc553d9-14cc-4d0f-8470-a69eed61b6b2","Type":"ContainerStarted","Data":"f986ee9a5ab3a17830953d10922b9eba8e85d7f9fe68914b430ae07771539412"} Apr 24 21:27:32.910698 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:32.910654 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" podStartSLOduration=1.36614338 podStartE2EDuration="7.910640994s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:25.916031457 +0000 UTC m=+46.764075550" lastFinishedPulling="2026-04-24 21:27:32.460529067 +0000 UTC m=+53.308573164" observedRunningTime="2026-04-24 21:27:32.909057614 +0000 UTC m=+53.757101787" watchObservedRunningTime="2026-04-24 21:27:32.910640994 +0000 UTC m=+53.758685149" Apr 24 21:27:36.832592 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:36.832560 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k487s" Apr 24 21:27:45.369056 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.369007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:27:45.372047 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.372026 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:27:45.379355 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:45.379331 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:27:45.379454 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:45.379399 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:49.379379212 +0000 UTC m=+130.227423305 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : secret "metrics-daemon-secret" not found Apr 24 21:27:45.469910 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.469872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:27:45.469910 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.469919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:27:45.470069 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:45.470005 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:45.470069 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:45.470010 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:45.470069 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:45.470055 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert podName:3d01f7a9-76ee-487e-9801-6c420df8721a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:17.470043557 +0000 UTC m=+98.318087650 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert") pod "ingress-canary-pjmg6" (UID: "3d01f7a9-76ee-487e-9801-6c420df8721a") : secret "canary-serving-cert" not found Apr 24 21:27:45.470069 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.470055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:45.470202 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:27:45.470072 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls podName:07d586c6-47b1-4dc6-96e0-7dac12734909 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:17.470060166 +0000 UTC m=+98.318104260 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls") pod "dns-default-hmts4" (UID: "07d586c6-47b1-4dc6-96e0-7dac12734909") : secret "dns-default-metrics-tls" not found Apr 24 21:27:45.472573 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.472557 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:27:45.483003 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.482982 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:27:45.493653 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.493626 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58f4n\" (UniqueName: \"kubernetes.io/projected/9374c699-094f-4c29-9406-afbd076c9722-kube-api-access-58f4n\") pod \"network-check-target-77b2w\" (UID: \"9374c699-094f-4c29-9406-afbd076c9722\") " pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:45.560518 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.560496 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9whs\"" Apr 24 21:27:45.568658 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.568644 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:45.679778 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.679748 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-77b2w"] Apr 24 21:27:45.682244 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:27:45.682216 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9374c699_094f_4c29_9406_afbd076c9722.slice/crio-2bdfb32e9122927540a3a5868126783b793003f01eaa920a58d605919a1e8790 WatchSource:0}: Error finding container 2bdfb32e9122927540a3a5868126783b793003f01eaa920a58d605919a1e8790: Status 404 returned error can't find the container with id 2bdfb32e9122927540a3a5868126783b793003f01eaa920a58d605919a1e8790 Apr 24 21:27:45.916695 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:45.916601 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-77b2w" event={"ID":"9374c699-094f-4c29-9406-afbd076c9722","Type":"ContainerStarted","Data":"2bdfb32e9122927540a3a5868126783b793003f01eaa920a58d605919a1e8790"} Apr 24 21:27:48.925205 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:48.925170 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-77b2w" event={"ID":"9374c699-094f-4c29-9406-afbd076c9722","Type":"ContainerStarted","Data":"39e117605e58c2327848e91cf745f651ca9b008910bfa2f6956fd096ef70752a"} Apr 24 21:27:48.925644 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:48.925342 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:27:48.946117 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:27:48.946067 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-77b2w" 
podStartSLOduration=66.930924528 podStartE2EDuration="1m9.946052776s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:27:45.684011146 +0000 UTC m=+66.532055241" lastFinishedPulling="2026-04-24 21:27:48.699139395 +0000 UTC m=+69.547183489" observedRunningTime="2026-04-24 21:27:48.945480901 +0000 UTC m=+69.793525017" watchObservedRunningTime="2026-04-24 21:27:48.946052776 +0000 UTC m=+69.794096886" Apr 24 21:28:17.489376 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:28:17.489288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:28:17.489376 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:28:17.489391 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:28:17.489828 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:28:17.489410 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:17.489828 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:28:17.489472 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:17.489828 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:28:17.489475 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls podName:07d586c6-47b1-4dc6-96e0-7dac12734909 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:21.489456396 +0000 UTC m=+162.337500489 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls") pod "dns-default-hmts4" (UID: "07d586c6-47b1-4dc6-96e0-7dac12734909") : secret "dns-default-metrics-tls" not found Apr 24 21:28:17.489828 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:28:17.489529 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert podName:3d01f7a9-76ee-487e-9801-6c420df8721a nodeName:}" failed. No retries permitted until 2026-04-24 21:29:21.489516913 +0000 UTC m=+162.337561006 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert") pod "ingress-canary-pjmg6" (UID: "3d01f7a9-76ee-487e-9801-6c420df8721a") : secret "canary-serving-cert" not found Apr 24 21:28:19.929378 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:28:19.929345 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-77b2w" Apr 24 21:28:49.401707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:28:49.401673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:28:49.402144 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:28:49.401814 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:28:49.402144 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:28:49.401886 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs podName:6db469f2-5afc-41c5-8338-9558deee2bd6 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:30:51.401868335 +0000 UTC m=+252.249912432 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs") pod "network-metrics-daemon-npqvg" (UID: "6db469f2-5afc-41c5-8338-9558deee2bd6") : secret "metrics-daemon-secret" not found Apr 24 21:29:00.060252 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:00.060222 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gh9lb_95bfde24-898e-4ab6-9414-d93c895b9ba6/dns-node-resolver/0.log" Apr 24 21:29:01.259953 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:01.259929 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ws882_ee2e975b-8948-45ed-9de6-345f4c54c29e/node-ca/0.log" Apr 24 21:29:16.593855 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:29:16.593766 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pjmg6" podUID="3d01f7a9-76ee-487e-9801-6c420df8721a" Apr 24 21:29:16.611708 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:29:16.611681 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hmts4" podUID="07d586c6-47b1-4dc6-96e0-7dac12734909" Apr 24 21:29:17.093101 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:17.093072 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hmts4" Apr 24 21:29:17.093101 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:17.093106 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:29:17.652904 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:29:17.652860 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-npqvg" podUID="6db469f2-5afc-41c5-8338-9558deee2bd6" Apr 24 21:29:21.527037 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.526983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:29:21.527037 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.527050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:29:21.529621 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.529588 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07d586c6-47b1-4dc6-96e0-7dac12734909-metrics-tls\") pod \"dns-default-hmts4\" (UID: \"07d586c6-47b1-4dc6-96e0-7dac12734909\") " pod="openshift-dns/dns-default-hmts4" Apr 24 21:29:21.529738 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.529655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d01f7a9-76ee-487e-9801-6c420df8721a-cert\") pod \"ingress-canary-pjmg6\" (UID: \"3d01f7a9-76ee-487e-9801-6c420df8721a\") " pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:29:21.600541 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 21:29:21.600508 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zkwd8\"" Apr 24 21:29:21.600541 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.600508 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv4ts\"" Apr 24 21:29:21.604702 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.604684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hmts4" Apr 24 21:29:21.604793 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.604759 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pjmg6" Apr 24 21:29:21.778592 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.778513 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pjmg6"] Apr 24 21:29:21.782065 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:29:21.782038 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d01f7a9_76ee_487e_9801_6c420df8721a.slice/crio-93bc56779ee293f1db125a48f2051278fb73a8c2ab673b22e37ed8511cf25579 WatchSource:0}: Error finding container 93bc56779ee293f1db125a48f2051278fb73a8c2ab673b22e37ed8511cf25579: Status 404 returned error can't find the container with id 93bc56779ee293f1db125a48f2051278fb73a8c2ab673b22e37ed8511cf25579 Apr 24 21:29:21.787413 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.787370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hmts4"] Apr 24 21:29:21.790255 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:29:21.790233 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d586c6_47b1_4dc6_96e0_7dac12734909.slice/crio-e2fd093ac3a51ef1214ad2d96faf1f5f012bcd5b917196a7503a804e58e16a23 WatchSource:0}: Error finding container e2fd093ac3a51ef1214ad2d96faf1f5f012bcd5b917196a7503a804e58e16a23: Status 404 returned error can't find the container with id e2fd093ac3a51ef1214ad2d96faf1f5f012bcd5b917196a7503a804e58e16a23 Apr 24 21:29:21.949284 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.949251 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8kgzf"] Apr 24 21:29:21.953702 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.953685 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:21.958775 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.958735 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:29:21.958775 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.958754 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:29:21.958775 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.958773 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:29:21.959204 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.958812 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sdl4k\"" Apr 24 21:29:21.959204 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.958755 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:29:21.992944 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:21.992892 2571 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8kgzf"] Apr 24 21:29:22.026847 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.026823 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6bff679f46-648dd"] Apr 24 21:29:22.029494 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.029456 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.032739 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.032719 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:29:22.033062 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.033049 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:29:22.039847 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.039830 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:29:22.040749 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.040735 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qvff9\"" Apr 24 21:29:22.040945 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.040931 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:29:22.066932 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.066909 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6bff679f46-648dd"] Apr 24 21:29:22.103348 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.103318 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pjmg6" 
event={"ID":"3d01f7a9-76ee-487e-9801-6c420df8721a","Type":"ContainerStarted","Data":"93bc56779ee293f1db125a48f2051278fb73a8c2ab673b22e37ed8511cf25579"} Apr 24 21:29:22.104126 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.104103 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hmts4" event={"ID":"07d586c6-47b1-4dc6-96e0-7dac12734909","Type":"ContainerStarted","Data":"e2fd093ac3a51ef1214ad2d96faf1f5f012bcd5b917196a7503a804e58e16a23"} Apr 24 21:29:22.131421 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131394 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50557aa9-2f82-4792-88a0-6dca21949f46-image-registry-private-configuration\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.131508 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131426 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50557aa9-2f82-4792-88a0-6dca21949f46-installation-pull-secrets\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.131508 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131486 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c334feeb-a7fc-4d45-ad79-e67520d0cd94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.131597 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131529 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c334feeb-a7fc-4d45-ad79-e67520d0cd94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.131641 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c334feeb-a7fc-4d45-ad79-e67520d0cd94-crio-socket\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.131641 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvh96\" (UniqueName: \"kubernetes.io/projected/c334feeb-a7fc-4d45-ad79-e67520d0cd94-kube-api-access-cvh96\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.131641 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131628 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50557aa9-2f82-4792-88a0-6dca21949f46-trusted-ca\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.131744 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50557aa9-2f82-4792-88a0-6dca21949f46-ca-trust-extracted\") pod 
\"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.131744 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131706 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50557aa9-2f82-4792-88a0-6dca21949f46-registry-certificates\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.131744 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-bound-sa-token\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.131744 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flcdf\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-kube-api-access-flcdf\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.131863 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.131788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c334feeb-a7fc-4d45-ad79-e67520d0cd94-data-volume\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.131863 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:29:22.131819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-registry-tls\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.232416 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232385 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c334feeb-a7fc-4d45-ad79-e67520d0cd94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.232416 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c334feeb-a7fc-4d45-ad79-e67520d0cd94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.232631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c334feeb-a7fc-4d45-ad79-e67520d0cd94-crio-socket\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.232631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvh96\" (UniqueName: \"kubernetes.io/projected/c334feeb-a7fc-4d45-ad79-e67520d0cd94-kube-api-access-cvh96\") pod 
\"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.232631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232519 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50557aa9-2f82-4792-88a0-6dca21949f46-trusted-ca\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.232631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232563 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50557aa9-2f82-4792-88a0-6dca21949f46-ca-trust-extracted\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.232631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232585 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50557aa9-2f82-4792-88a0-6dca21949f46-registry-certificates\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.232631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-bound-sa-token\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.232631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232607 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c334feeb-a7fc-4d45-ad79-e67520d0cd94-crio-socket\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.232631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flcdf\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-kube-api-access-flcdf\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.233034 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c334feeb-a7fc-4d45-ad79-e67520d0cd94-data-volume\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.233034 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-registry-tls\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.233034 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50557aa9-2f82-4792-88a0-6dca21949f46-image-registry-private-configuration\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " 
pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.233034 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.232850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50557aa9-2f82-4792-88a0-6dca21949f46-installation-pull-secrets\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.233228 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.233044 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c334feeb-a7fc-4d45-ad79-e67520d0cd94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.233390 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.233365 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c334feeb-a7fc-4d45-ad79-e67520d0cd94-data-volume\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.233595 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.233568 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50557aa9-2f82-4792-88a0-6dca21949f46-ca-trust-extracted\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.233781 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.233723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/50557aa9-2f82-4792-88a0-6dca21949f46-registry-certificates\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.233977 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.233819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50557aa9-2f82-4792-88a0-6dca21949f46-trusted-ca\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.236111 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.236084 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c334feeb-a7fc-4d45-ad79-e67520d0cd94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.236213 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.236148 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-registry-tls\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.236275 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.236208 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50557aa9-2f82-4792-88a0-6dca21949f46-image-registry-private-configuration\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.236275 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.236230 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50557aa9-2f82-4792-88a0-6dca21949f46-installation-pull-secrets\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.282811 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.282751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flcdf\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-kube-api-access-flcdf\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.307847 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.307818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50557aa9-2f82-4792-88a0-6dca21949f46-bound-sa-token\") pod \"image-registry-6bff679f46-648dd\" (UID: \"50557aa9-2f82-4792-88a0-6dca21949f46\") " pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.314070 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.314048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvh96\" (UniqueName: \"kubernetes.io/projected/c334feeb-a7fc-4d45-ad79-e67520d0cd94-kube-api-access-cvh96\") pod \"insights-runtime-extractor-8kgzf\" (UID: \"c334feeb-a7fc-4d45-ad79-e67520d0cd94\") " pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.339102 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.339078 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:22.547951 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.547867 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6bff679f46-648dd"] Apr 24 21:29:22.562226 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.562205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8kgzf" Apr 24 21:29:22.733259 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:22.733205 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8kgzf"] Apr 24 21:29:22.737948 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:29:22.737921 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc334feeb_a7fc_4d45_ad79_e67520d0cd94.slice/crio-8898ec918e2e06d9a446cdc2440fb0ddf3484239c81a8af2ceedb060b3b135ab WatchSource:0}: Error finding container 8898ec918e2e06d9a446cdc2440fb0ddf3484239c81a8af2ceedb060b3b135ab: Status 404 returned error can't find the container with id 8898ec918e2e06d9a446cdc2440fb0ddf3484239c81a8af2ceedb060b3b135ab Apr 24 21:29:23.109508 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:23.109401 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bff679f46-648dd" event={"ID":"50557aa9-2f82-4792-88a0-6dca21949f46","Type":"ContainerStarted","Data":"7b3ee7489ebee5547710a75da0b6e2986690151a15469f3732edb23e347cdc61"} Apr 24 21:29:23.109508 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:23.109441 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bff679f46-648dd" event={"ID":"50557aa9-2f82-4792-88a0-6dca21949f46","Type":"ContainerStarted","Data":"47eac10a7bf68d49342360baf0cc21590825e10c0c1ccef8cd5ebb7e2beac67d"} Apr 24 21:29:23.109716 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:29:23.109523 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6bff679f46-648dd" Apr 24 21:29:23.111133 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:23.111107 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8kgzf" event={"ID":"c334feeb-a7fc-4d45-ad79-e67520d0cd94","Type":"ContainerStarted","Data":"598d4654861686fea19c8bb550ddfaedf020f99d6de44797f08444769233fb39"} Apr 24 21:29:23.111253 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:23.111139 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8kgzf" event={"ID":"c334feeb-a7fc-4d45-ad79-e67520d0cd94","Type":"ContainerStarted","Data":"8898ec918e2e06d9a446cdc2440fb0ddf3484239c81a8af2ceedb060b3b135ab"} Apr 24 21:29:23.142048 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:23.141995 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6bff679f46-648dd" podStartSLOduration=2.141981674 podStartE2EDuration="2.141981674s" podCreationTimestamp="2026-04-24 21:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:23.13974454 +0000 UTC m=+163.987788655" watchObservedRunningTime="2026-04-24 21:29:23.141981674 +0000 UTC m=+163.990025783" Apr 24 21:29:24.119104 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:24.119071 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pjmg6" event={"ID":"3d01f7a9-76ee-487e-9801-6c420df8721a","Type":"ContainerStarted","Data":"898ea4e4d011d80384d903ae20b29a24a1b8ed92d670d4eb9ada75a38dcbdef0"} Apr 24 21:29:24.120550 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:24.120522 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hmts4" 
event={"ID":"07d586c6-47b1-4dc6-96e0-7dac12734909","Type":"ContainerStarted","Data":"2ec262abd9884b29665eb2851d9370cd143d7146c376faeb31817e09a4bbc144"} Apr 24 21:29:24.140071 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:24.140020 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pjmg6" podStartSLOduration=129.10934515 podStartE2EDuration="2m11.140002822s" podCreationTimestamp="2026-04-24 21:27:13 +0000 UTC" firstStartedPulling="2026-04-24 21:29:21.784329389 +0000 UTC m=+162.632373482" lastFinishedPulling="2026-04-24 21:29:23.814987047 +0000 UTC m=+164.663031154" observedRunningTime="2026-04-24 21:29:24.138852805 +0000 UTC m=+164.986896920" watchObservedRunningTime="2026-04-24 21:29:24.140002822 +0000 UTC m=+164.988046937" Apr 24 21:29:25.124947 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:25.124901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8kgzf" event={"ID":"c334feeb-a7fc-4d45-ad79-e67520d0cd94","Type":"ContainerStarted","Data":"26bbd86b305863d7db124cf84752f2c208a5077082a9becfc2e63b3e17f45067"} Apr 24 21:29:25.126629 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:25.126599 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hmts4" event={"ID":"07d586c6-47b1-4dc6-96e0-7dac12734909","Type":"ContainerStarted","Data":"f35b95a7b228b15d133dd5b284cc14668ff0d2e0cc146252ba2ab2efe1e48d93"} Apr 24 21:29:25.152656 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:25.152609 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hmts4" podStartSLOduration=130.127955096 podStartE2EDuration="2m12.152597089s" podCreationTimestamp="2026-04-24 21:27:13 +0000 UTC" firstStartedPulling="2026-04-24 21:29:21.791825539 +0000 UTC m=+162.639869633" lastFinishedPulling="2026-04-24 21:29:23.816467532 +0000 UTC m=+164.664511626" observedRunningTime="2026-04-24 21:29:25.151428104 
+0000 UTC m=+165.999472220" watchObservedRunningTime="2026-04-24 21:29:25.152597089 +0000 UTC m=+166.000641203" Apr 24 21:29:26.131361 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:26.131329 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8kgzf" event={"ID":"c334feeb-a7fc-4d45-ad79-e67520d0cd94","Type":"ContainerStarted","Data":"ef5b7a90f5826a96655964bea6f097249feb9858311228e59a6746b5d15218d6"} Apr 24 21:29:26.131769 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:26.131629 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hmts4" Apr 24 21:29:26.154495 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:26.154453 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8kgzf" podStartSLOduration=2.562513365 podStartE2EDuration="5.154440607s" podCreationTimestamp="2026-04-24 21:29:21 +0000 UTC" firstStartedPulling="2026-04-24 21:29:22.797322858 +0000 UTC m=+163.645366957" lastFinishedPulling="2026-04-24 21:29:25.389250105 +0000 UTC m=+166.237294199" observedRunningTime="2026-04-24 21:29:26.153155685 +0000 UTC m=+167.001199812" watchObservedRunningTime="2026-04-24 21:29:26.154440607 +0000 UTC m=+167.002484753" Apr 24 21:29:29.644510 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:29.644467 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg" Apr 24 21:29:29.881919 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:29.881846 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x" podUID="78b91d10-b1cb-49df-9404-87640d341585" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/readyz\": dial tcp 10.134.0.7:8000: connect: connection refused" Apr 24 21:29:30.142425 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:30.142356 2571 generic.go:358] "Generic (PLEG): container finished" podID="78b91d10-b1cb-49df-9404-87640d341585" containerID="1c042fda3a3518d92c50f4dabe12e7a56ba978bd5bc7f3d749753643e65a3a56" exitCode=1 Apr 24 21:29:30.142425 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:30.142407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x" event={"ID":"78b91d10-b1cb-49df-9404-87640d341585","Type":"ContainerDied","Data":"1c042fda3a3518d92c50f4dabe12e7a56ba978bd5bc7f3d749753643e65a3a56"} Apr 24 21:29:30.142718 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:30.142706 2571 scope.go:117] "RemoveContainer" containerID="1c042fda3a3518d92c50f4dabe12e7a56ba978bd5bc7f3d749753643e65a3a56" Apr 24 21:29:31.146544 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:31.146510 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x" event={"ID":"78b91d10-b1cb-49df-9404-87640d341585","Type":"ContainerStarted","Data":"9b0f923d06fc8f821f14d4769583ca201bf535c6b40008a5dea2d5a6babbf882"} Apr 24 21:29:31.146917 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:31.146797 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x" Apr 24 21:29:31.147467 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:29:31.147447 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-848db9d749-mxw4x" Apr 24 21:29:33.136896 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:33.136867 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hmts4" Apr 24 21:29:37.298205 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.298167 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wgtgk"] Apr 24 21:29:37.345894 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.345850 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.348823 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.348792 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:29:37.348960 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.348792 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:29:37.349964 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.349945 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:29:37.349964 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.349955 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k4448\"" Apr 24 21:29:37.350150 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.350027 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:29:37.350150 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.350037 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:29:37.350150 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.350058 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:29:37.443249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443215 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-textfile\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.443442 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.443442 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443368 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0066d6d0-157c-4bef-89cb-0323a329a6a2-metrics-client-ca\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.443442 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443401 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bvl\" (UniqueName: \"kubernetes.io/projected/0066d6d0-157c-4bef-89cb-0323a329a6a2-kube-api-access-97bvl\") pod \"node-exporter-wgtgk\" (UID: 
\"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.443442 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-tls\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.443442 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-wtmp\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.443600 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-root\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.443600 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443518 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-sys\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.443600 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.443554 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.544634 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.544634 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0066d6d0-157c-4bef-89cb-0323a329a6a2-metrics-client-ca\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.544846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97bvl\" (UniqueName: \"kubernetes.io/projected/0066d6d0-157c-4bef-89cb-0323a329a6a2-kube-api-access-97bvl\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.544846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-tls\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.544846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544711 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-wtmp\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.544846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-root\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.544846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544773 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-sys\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.544846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.545126 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-textfile\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk" Apr 24 21:29:37.545126 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:29:37.544870 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-sys\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.545126 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544870 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-root\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.545126 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.544962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-wtmp\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.545338 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.545175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-textfile\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.545383 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.545335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.545433 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.545418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0066d6d0-157c-4bef-89cb-0323a329a6a2-metrics-client-ca\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.547178 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.547155 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-tls\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.547279 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.547260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0066d6d0-157c-4bef-89cb-0323a329a6a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.553918 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.553875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bvl\" (UniqueName: \"kubernetes.io/projected/0066d6d0-157c-4bef-89cb-0323a329a6a2-kube-api-access-97bvl\") pod \"node-exporter-wgtgk\" (UID: \"0066d6d0-157c-4bef-89cb-0323a329a6a2\") " pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.654772 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:37.654749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wgtgk"
Apr 24 21:29:37.665208 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:29:37.665180 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0066d6d0_157c_4bef_89cb_0323a329a6a2.slice/crio-a16213a304d9db7d87504b273681d0613f9d10c47f1a7767ad48de0e02b2826e WatchSource:0}: Error finding container a16213a304d9db7d87504b273681d0613f9d10c47f1a7767ad48de0e02b2826e: Status 404 returned error can't find the container with id a16213a304d9db7d87504b273681d0613f9d10c47f1a7767ad48de0e02b2826e
Apr 24 21:29:38.165854 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:38.165815 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wgtgk" event={"ID":"0066d6d0-157c-4bef-89cb-0323a329a6a2","Type":"ContainerStarted","Data":"a16213a304d9db7d87504b273681d0613f9d10c47f1a7767ad48de0e02b2826e"}
Apr 24 21:29:39.170223 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:39.170186 2571 generic.go:358] "Generic (PLEG): container finished" podID="0066d6d0-157c-4bef-89cb-0323a329a6a2" containerID="a4cce3f7645f1ef690147312b7b9a78150b4590aed0f0cbfc86da8e09ebfea99" exitCode=0
Apr 24 21:29:39.170633 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:39.170248 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wgtgk" event={"ID":"0066d6d0-157c-4bef-89cb-0323a329a6a2","Type":"ContainerDied","Data":"a4cce3f7645f1ef690147312b7b9a78150b4590aed0f0cbfc86da8e09ebfea99"}
Apr 24 21:29:40.174649 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:40.174614 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wgtgk" event={"ID":"0066d6d0-157c-4bef-89cb-0323a329a6a2","Type":"ContainerStarted","Data":"3f2b332c5cd451b90beb93930e0a24516e403e5b11e8a7666b0908d99aec2745"}
Apr 24 21:29:40.174649 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:40.174649 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wgtgk" event={"ID":"0066d6d0-157c-4bef-89cb-0323a329a6a2","Type":"ContainerStarted","Data":"92ec14100e0f3139a73b3aee322562b25f4f7734f5693ef5c22821aeccaa7ed6"}
Apr 24 21:29:40.201191 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:40.201145 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wgtgk" podStartSLOduration=2.525335428 podStartE2EDuration="3.201131167s" podCreationTimestamp="2026-04-24 21:29:37 +0000 UTC" firstStartedPulling="2026-04-24 21:29:37.66737638 +0000 UTC m=+178.515420483" lastFinishedPulling="2026-04-24 21:29:38.343172125 +0000 UTC m=+179.191216222" observedRunningTime="2026-04-24 21:29:40.199379419 +0000 UTC m=+181.047423534" watchObservedRunningTime="2026-04-24 21:29:40.201131167 +0000 UTC m=+181.049175282"
Apr 24 21:29:42.516261 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.516226 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-776974b45c-vbjbd"]
Apr 24 21:29:42.518681 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.518667 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.521535 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.521509 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 21:29:42.521663 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.521522 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 21:29:42.521663 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.521553 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 21:29:42.521663 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.521635 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-p5hhl\""
Apr 24 21:29:42.521925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.521910 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 21:29:42.521973 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.521917 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 21:29:42.527091 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.527069 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 21:29:42.529549 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.529531 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-776974b45c-vbjbd"]
Apr 24 21:29:42.585512 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.585487 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-serving-certs-ca-bundle\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.585668 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.585523 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-secret-telemeter-client\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.585668 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.585555 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-telemeter-client-tls\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.585776 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.585661 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.585776 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.585728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-metrics-client-ca\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.585776 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.585760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whqjw\" (UniqueName: \"kubernetes.io/projected/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-kube-api-access-whqjw\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.585921 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.585889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-federate-client-tls\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.585973 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.585943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-telemeter-trusted-ca-bundle\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.687041 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.687001 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-federate-client-tls\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.687227 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.687099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-telemeter-trusted-ca-bundle\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.687227 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.687139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-serving-certs-ca-bundle\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.687227 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.687163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-secret-telemeter-client\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.687227 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.687190 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-telemeter-client-tls\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.687227 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.687218 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.687513 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.687254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-metrics-client-ca\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.687513 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.687281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whqjw\" (UniqueName: \"kubernetes.io/projected/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-kube-api-access-whqjw\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.688371 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.688340 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-metrics-client-ca\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.688624 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.688602 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-serving-certs-ca-bundle\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.688712 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.688686 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-telemeter-trusted-ca-bundle\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.689897 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.689872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.690011 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.689907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-federate-client-tls\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.690011 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.689986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-secret-telemeter-client\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.690011 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.690000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-telemeter-client-tls\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.697419 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.697397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whqjw\" (UniqueName: \"kubernetes.io/projected/2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02-kube-api-access-whqjw\") pod \"telemeter-client-776974b45c-vbjbd\" (UID: \"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02\") " pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.827861 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.827791 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd"
Apr 24 21:29:42.950126 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:42.950097 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-776974b45c-vbjbd"]
Apr 24 21:29:42.953071 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:29:42.953044 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd49d6e_f91b_4ecc_a1e8_d743b25c4a02.slice/crio-1269acc83ab3816c93105ffc536f703a6ed9f1accd0455f2346d5ce4abbaa537 WatchSource:0}: Error finding container 1269acc83ab3816c93105ffc536f703a6ed9f1accd0455f2346d5ce4abbaa537: Status 404 returned error can't find the container with id 1269acc83ab3816c93105ffc536f703a6ed9f1accd0455f2346d5ce4abbaa537
Apr 24 21:29:43.183046 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:43.182960 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd" event={"ID":"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02","Type":"ContainerStarted","Data":"1269acc83ab3816c93105ffc536f703a6ed9f1accd0455f2346d5ce4abbaa537"}
Apr 24 21:29:44.125544 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:44.125515 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6bff679f46-648dd"
Apr 24 21:29:45.190502 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:45.190469 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd" event={"ID":"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02","Type":"ContainerStarted","Data":"d9871bb37fe37a46f72cc377972534bc7e4e63709e5e33b49350b9aea503ff0b"}
Apr 24 21:29:46.195362 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:46.195256 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd" event={"ID":"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02","Type":"ContainerStarted","Data":"e1a4a0585f2488ee6201dbf6b9de022e21dab1b743d7e8e900bad0757c8380d0"}
Apr 24 21:29:46.195362 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:46.195312 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd" event={"ID":"2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02","Type":"ContainerStarted","Data":"15a5c5fcf5ceb20f20c6727d69ffc6efd0bcadd01337c8b82f5f09ffdfb5fd2c"}
Apr 24 21:29:46.221540 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:46.221494 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-776974b45c-vbjbd" podStartSLOduration=1.246454374 podStartE2EDuration="4.221480448s" podCreationTimestamp="2026-04-24 21:29:42 +0000 UTC" firstStartedPulling="2026-04-24 21:29:42.954843058 +0000 UTC m=+183.802887152" lastFinishedPulling="2026-04-24 21:29:45.929869133 +0000 UTC m=+186.777913226" observedRunningTime="2026-04-24 21:29:46.220036985 +0000 UTC m=+187.068081100" watchObservedRunningTime="2026-04-24 21:29:46.221480448 +0000 UTC m=+187.069524564"
Apr 24 21:29:55.782059 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:29:55.782012 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" podUID="0dc553d9-14cc-4d0f-8470-a69eed61b6b2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:30:05.782436 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:05.782396 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" podUID="0dc553d9-14cc-4d0f-8470-a69eed61b6b2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:30:15.783031 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:15.782993 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" podUID="0dc553d9-14cc-4d0f-8470-a69eed61b6b2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:30:15.783484 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:15.783059 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b"
Apr 24 21:30:15.783565 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:15.783536 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"206af984808a3d0e4b417e3687500e2b23aa1e47a73c5cae16ca24921456b700"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 24 21:30:15.783644 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:15.783601 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" podUID="0dc553d9-14cc-4d0f-8470-a69eed61b6b2" containerName="service-proxy" containerID="cri-o://206af984808a3d0e4b417e3687500e2b23aa1e47a73c5cae16ca24921456b700" gracePeriod=30
Apr 24 21:30:16.275445 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:16.275417 2571 generic.go:358] "Generic (PLEG): container finished" podID="0dc553d9-14cc-4d0f-8470-a69eed61b6b2" containerID="206af984808a3d0e4b417e3687500e2b23aa1e47a73c5cae16ca24921456b700" exitCode=2
Apr 24 21:30:16.275572 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:16.275462 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" event={"ID":"0dc553d9-14cc-4d0f-8470-a69eed61b6b2","Type":"ContainerDied","Data":"206af984808a3d0e4b417e3687500e2b23aa1e47a73c5cae16ca24921456b700"}
Apr 24 21:30:16.275572 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:16.275489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b87c788b4-dks6b" event={"ID":"0dc553d9-14cc-4d0f-8470-a69eed61b6b2","Type":"ContainerStarted","Data":"1059603b8d0d758f072da06fc4661371dd357338051445e84fd2037e0ee735c6"}
Apr 24 21:30:22.668653 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:22.668623 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wgtgk_0066d6d0-157c-4bef-89cb-0323a329a6a2/init-textfile/0.log"
Apr 24 21:30:22.870668 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:22.870630 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wgtgk_0066d6d0-157c-4bef-89cb-0323a329a6a2/node-exporter/0.log"
Apr 24 21:30:23.070787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:23.070757 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wgtgk_0066d6d0-157c-4bef-89cb-0323a329a6a2/kube-rbac-proxy/0.log"
Apr 24 21:30:25.869981 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:25.869955 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-776974b45c-vbjbd_2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02/telemeter-client/0.log"
Apr 24 21:30:26.069276 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:26.069238 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-776974b45c-vbjbd_2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02/reload/0.log"
Apr 24 21:30:26.272840 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:26.272811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-776974b45c-vbjbd_2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02/kube-rbac-proxy/0.log"
Apr 24 21:30:51.498702 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:51.498669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:30:51.501112 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:51.501088 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6db469f2-5afc-41c5-8338-9558deee2bd6-metrics-certs\") pod \"network-metrics-daemon-npqvg\" (UID: \"6db469f2-5afc-41c5-8338-9558deee2bd6\") " pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:30:51.548431 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:51.548408 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vlmsd\""
Apr 24 21:30:51.556007 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:51.555991 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npqvg"
Apr 24 21:30:51.670932 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:51.670908 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-npqvg"]
Apr 24 21:30:51.673849 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:30:51.673817 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6db469f2_5afc_41c5_8338_9558deee2bd6.slice/crio-57855abebd9f52ec483062cda1b342c5e87c7efb33f9a112ccbf26abf37a5c99 WatchSource:0}: Error finding container 57855abebd9f52ec483062cda1b342c5e87c7efb33f9a112ccbf26abf37a5c99: Status 404 returned error can't find the container with id 57855abebd9f52ec483062cda1b342c5e87c7efb33f9a112ccbf26abf37a5c99
Apr 24 21:30:52.371754 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:52.371709 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npqvg" event={"ID":"6db469f2-5afc-41c5-8338-9558deee2bd6","Type":"ContainerStarted","Data":"57855abebd9f52ec483062cda1b342c5e87c7efb33f9a112ccbf26abf37a5c99"}
Apr 24 21:30:53.375664 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:53.375631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npqvg" event={"ID":"6db469f2-5afc-41c5-8338-9558deee2bd6","Type":"ContainerStarted","Data":"fe5317597aa3f2f5133635787e9fe732b96114a0bd026e00880888c813cd4453"}
Apr 24 21:30:53.375664 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:53.375665 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npqvg" event={"ID":"6db469f2-5afc-41c5-8338-9558deee2bd6","Type":"ContainerStarted","Data":"82db5105e6af9d2d7638bf09ffce918a8e25e66ddd583d2c9b3fc90b4ae70a33"}
Apr 24 21:30:53.398233 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:30:53.395344 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-npqvg" podStartSLOduration=253.541546846 podStartE2EDuration="4m14.395324808s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:30:51.675597016 +0000 UTC m=+252.523641110" lastFinishedPulling="2026-04-24 21:30:52.529374975 +0000 UTC m=+253.377419072" observedRunningTime="2026-04-24 21:30:53.394010525 +0000 UTC m=+254.242054665" watchObservedRunningTime="2026-04-24 21:30:53.395324808 +0000 UTC m=+254.243368924"
Apr 24 21:31:32.362057 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.362021 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7dkks"]
Apr 24 21:31:32.363763 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.363748 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:32.374868 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:31:32.374846 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"original-pull-secret\" is forbidden: User \"system:node:ip-10-0-139-5.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-139-5.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"original-pull-secret\"" type="*v1.Secret"
Apr 24 21:31:32.375056 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.375037 2571 status_manager.go:895] "Failed to get status for pod" podUID="2405d410-d9d7-4e59-96b9-c1697c0b1258" pod="kube-system/global-pull-secret-syncer-7dkks" err="pods \"global-pull-secret-syncer-7dkks\" is forbidden: User \"system:node:ip-10-0-139-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-139-5.ec2.internal' and this object"
Apr 24 21:31:32.402070 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.402041 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7dkks"]
Apr 24 21:31:32.481908 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.481872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2405d410-d9d7-4e59-96b9-c1697c0b1258-original-pull-secret\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:32.482059 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.481928 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2405d410-d9d7-4e59-96b9-c1697c0b1258-dbus\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:32.482059 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.482022 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2405d410-d9d7-4e59-96b9-c1697c0b1258-kubelet-config\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:32.582815 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.582778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2405d410-d9d7-4e59-96b9-c1697c0b1258-kubelet-config\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:32.583035 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.582830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2405d410-d9d7-4e59-96b9-c1697c0b1258-original-pull-secret\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:32.583035 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.582850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2405d410-d9d7-4e59-96b9-c1697c0b1258-dbus\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:32.583035 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.582907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2405d410-d9d7-4e59-96b9-c1697c0b1258-kubelet-config\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:32.583035 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:32.583016 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2405d410-d9d7-4e59-96b9-c1697c0b1258-dbus\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:33.583787 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:31:33.583747 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: failed to sync secret cache: timed out waiting for the condition
Apr 24 21:31:33.584169 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:31:33.583838 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2405d410-d9d7-4e59-96b9-c1697c0b1258-original-pull-secret podName:2405d410-d9d7-4e59-96b9-c1697c0b1258 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:34.083820296 +0000 UTC m=+294.931864390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2405d410-d9d7-4e59-96b9-c1697c0b1258-original-pull-secret") pod "global-pull-secret-syncer-7dkks" (UID: "2405d410-d9d7-4e59-96b9-c1697c0b1258") : failed to sync secret cache: timed out waiting for the condition
Apr 24 21:31:33.781887 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:33.781825 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:31:34.092899 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:34.092864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2405d410-d9d7-4e59-96b9-c1697c0b1258-original-pull-secret\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:34.095223 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:34.095207 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2405d410-d9d7-4e59-96b9-c1697c0b1258-original-pull-secret\") pod \"global-pull-secret-syncer-7dkks\" (UID: \"2405d410-d9d7-4e59-96b9-c1697c0b1258\") " pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:34.171942 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:34.171898 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dkks"
Apr 24 21:31:34.285089 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:34.285061 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7dkks"]
Apr 24 21:31:34.288140 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:31:34.288113 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2405d410_d9d7_4e59_96b9_c1697c0b1258.slice/crio-933ce96a0042fb5ee96979738312e6a8c6a25b05290715315c9e298b9bbe6103 WatchSource:0}: Error finding container 933ce96a0042fb5ee96979738312e6a8c6a25b05290715315c9e298b9bbe6103: Status 404 returned error can't find the container with id 933ce96a0042fb5ee96979738312e6a8c6a25b05290715315c9e298b9bbe6103
Apr 24 21:31:34.485101 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:34.485023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7dkks" event={"ID":"2405d410-d9d7-4e59-96b9-c1697c0b1258","Type":"ContainerStarted","Data":"933ce96a0042fb5ee96979738312e6a8c6a25b05290715315c9e298b9bbe6103"}
Apr 24 21:31:38.498021 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:38.497983 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7dkks" event={"ID":"2405d410-d9d7-4e59-96b9-c1697c0b1258","Type":"ContainerStarted","Data":"a66d9282ee25652a5f6d92d951c49d2dddec95759165bb6bcb3ae871c9b7caf3"}
Apr 24 21:31:38.521744 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:38.521697 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7dkks" podStartSLOduration=2.960133786 podStartE2EDuration="6.521682738s" podCreationTimestamp="2026-04-24 21:31:32 +0000 UTC" firstStartedPulling="2026-04-24 21:31:34.289715007 +0000 UTC m=+295.137759100" lastFinishedPulling="2026-04-24 21:31:37.851263959 +0000 UTC m=+298.699308052"
observedRunningTime="2026-04-24 21:31:38.520205004 +0000 UTC m=+299.368249130" watchObservedRunningTime="2026-04-24 21:31:38.521682738 +0000 UTC m=+299.369726853" Apr 24 21:31:39.592128 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:31:39.592101 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:33:31.728439 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.728404 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-wm5mt"] Apr 24 21:33:31.730586 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.730565 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:31.733624 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.733596 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4bgsb\"" Apr 24 21:33:31.734330 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.734291 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:33:31.734417 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.734291 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:33:31.734671 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.734657 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:33:31.740899 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.740876 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-wm5mt"] Apr 24 21:33:31.837386 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.837361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqch8\" (UniqueName: \"kubernetes.io/projected/9399f7cb-5834-4797-891d-20468636cd00-kube-api-access-pqch8\") pod \"seaweedfs-86cc847c5c-wm5mt\" (UID: 
\"9399f7cb-5834-4797-891d-20468636cd00\") " pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:31.837510 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.837399 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9399f7cb-5834-4797-891d-20468636cd00-data\") pod \"seaweedfs-86cc847c5c-wm5mt\" (UID: \"9399f7cb-5834-4797-891d-20468636cd00\") " pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:31.938128 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.938097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqch8\" (UniqueName: \"kubernetes.io/projected/9399f7cb-5834-4797-891d-20468636cd00-kube-api-access-pqch8\") pod \"seaweedfs-86cc847c5c-wm5mt\" (UID: \"9399f7cb-5834-4797-891d-20468636cd00\") " pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:31.938265 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.938135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9399f7cb-5834-4797-891d-20468636cd00-data\") pod \"seaweedfs-86cc847c5c-wm5mt\" (UID: \"9399f7cb-5834-4797-891d-20468636cd00\") " pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:31.938561 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.938542 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9399f7cb-5834-4797-891d-20468636cd00-data\") pod \"seaweedfs-86cc847c5c-wm5mt\" (UID: \"9399f7cb-5834-4797-891d-20468636cd00\") " pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:31.946478 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:31.946456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqch8\" (UniqueName: \"kubernetes.io/projected/9399f7cb-5834-4797-891d-20468636cd00-kube-api-access-pqch8\") pod \"seaweedfs-86cc847c5c-wm5mt\" (UID: 
\"9399f7cb-5834-4797-891d-20468636cd00\") " pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:32.039481 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:32.039460 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:32.157137 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:32.157106 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-wm5mt"] Apr 24 21:33:32.160014 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:33:32.159987 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9399f7cb_5834_4797_891d_20468636cd00.slice/crio-6ec746d8b784b92f0897b6f754c0e657b11fcd883375279d5435fcaf4a5289d4 WatchSource:0}: Error finding container 6ec746d8b784b92f0897b6f754c0e657b11fcd883375279d5435fcaf4a5289d4: Status 404 returned error can't find the container with id 6ec746d8b784b92f0897b6f754c0e657b11fcd883375279d5435fcaf4a5289d4 Apr 24 21:33:32.161180 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:32.161163 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:33:32.786070 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:32.786021 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-wm5mt" event={"ID":"9399f7cb-5834-4797-891d-20468636cd00","Type":"ContainerStarted","Data":"6ec746d8b784b92f0897b6f754c0e657b11fcd883375279d5435fcaf4a5289d4"} Apr 24 21:33:34.794717 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:34.794684 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-wm5mt" event={"ID":"9399f7cb-5834-4797-891d-20468636cd00","Type":"ContainerStarted","Data":"23b557e42df5799e1e1198b64ae52d0d2880b7e4ab66d56cf9ded78b69caa394"} Apr 24 21:33:34.795067 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:34.794814 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:33:34.813176 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:34.813128 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-wm5mt" podStartSLOduration=1.37418685 podStartE2EDuration="3.813117096s" podCreationTimestamp="2026-04-24 21:33:31 +0000 UTC" firstStartedPulling="2026-04-24 21:33:32.161353272 +0000 UTC m=+413.009397366" lastFinishedPulling="2026-04-24 21:33:34.600283516 +0000 UTC m=+415.448327612" observedRunningTime="2026-04-24 21:33:34.811733841 +0000 UTC m=+415.659777958" watchObservedRunningTime="2026-04-24 21:33:34.813117096 +0000 UTC m=+415.661161211" Apr 24 21:33:40.799467 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:33:40.799436 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-wm5mt" Apr 24 21:34:42.656626 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.656593 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-hqssg"] Apr 24 21:34:42.658534 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.658503 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:42.661630 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.661610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-8zn5s\"" Apr 24 21:34:42.661630 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.661623 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 21:34:42.672045 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.672017 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-hg7rz"] Apr 24 21:34:42.673876 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.673861 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:42.678515 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.678484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hqssg"] Apr 24 21:34:42.682110 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.682089 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 21:34:42.682551 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.682532 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-chx5b\"" Apr 24 21:34:42.697384 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.697347 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-hg7rz"] Apr 24 21:34:42.735415 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.735377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-cert\") pod \"odh-model-controller-696fc77849-hg7rz\" (UID: \"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e\") " pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:42.735415 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.735415 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wbd\" (UniqueName: \"kubernetes.io/projected/cc602714-ec8b-41a5-b0c0-e3c463957643-kube-api-access-87wbd\") pod \"model-serving-api-86f7b4b499-hqssg\" (UID: \"cc602714-ec8b-41a5-b0c0-e3c463957643\") " pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:42.735611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.735474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cc602714-ec8b-41a5-b0c0-e3c463957643-tls-certs\") pod \"model-serving-api-86f7b4b499-hqssg\" (UID: \"cc602714-ec8b-41a5-b0c0-e3c463957643\") " pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:42.735611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.735493 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxfn\" (UniqueName: \"kubernetes.io/projected/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-kube-api-access-lrxfn\") pod \"odh-model-controller-696fc77849-hg7rz\" (UID: \"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e\") " pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:42.836479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.836438 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-cert\") pod \"odh-model-controller-696fc77849-hg7rz\" (UID: \"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e\") " pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:42.836479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.836479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87wbd\" (UniqueName: \"kubernetes.io/projected/cc602714-ec8b-41a5-b0c0-e3c463957643-kube-api-access-87wbd\") pod \"model-serving-api-86f7b4b499-hqssg\" (UID: \"cc602714-ec8b-41a5-b0c0-e3c463957643\") " pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:42.836682 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.836528 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc602714-ec8b-41a5-b0c0-e3c463957643-tls-certs\") pod \"model-serving-api-86f7b4b499-hqssg\" (UID: \"cc602714-ec8b-41a5-b0c0-e3c463957643\") " pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:42.836682 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.836554 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrxfn\" (UniqueName: \"kubernetes.io/projected/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-kube-api-access-lrxfn\") pod \"odh-model-controller-696fc77849-hg7rz\" (UID: \"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e\") " pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:42.836682 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:34:42.836593 2571 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 21:34:42.836783 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:34:42.836683 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-cert podName:0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e nodeName:}" failed. No retries permitted until 2026-04-24 21:34:43.336661167 +0000 UTC m=+484.184705262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-cert") pod "odh-model-controller-696fc77849-hg7rz" (UID: "0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e") : secret "odh-model-controller-webhook-cert" not found Apr 24 21:34:42.839080 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.839058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc602714-ec8b-41a5-b0c0-e3c463957643-tls-certs\") pod \"model-serving-api-86f7b4b499-hqssg\" (UID: \"cc602714-ec8b-41a5-b0c0-e3c463957643\") " pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:42.846890 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.846867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxfn\" (UniqueName: \"kubernetes.io/projected/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-kube-api-access-lrxfn\") pod \"odh-model-controller-696fc77849-hg7rz\" (UID: \"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e\") " 
pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:42.849157 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.849138 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wbd\" (UniqueName: \"kubernetes.io/projected/cc602714-ec8b-41a5-b0c0-e3c463957643-kube-api-access-87wbd\") pod \"model-serving-api-86f7b4b499-hqssg\" (UID: \"cc602714-ec8b-41a5-b0c0-e3c463957643\") " pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:42.967974 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:42.967887 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:43.102321 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:43.102270 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hqssg"] Apr 24 21:34:43.105264 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:34:43.105233 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc602714_ec8b_41a5_b0c0_e3c463957643.slice/crio-12a5f853160da869966b9331eaefaa10d856d77fe487e0ac0a48a17bf0943cb4 WatchSource:0}: Error finding container 12a5f853160da869966b9331eaefaa10d856d77fe487e0ac0a48a17bf0943cb4: Status 404 returned error can't find the container with id 12a5f853160da869966b9331eaefaa10d856d77fe487e0ac0a48a17bf0943cb4 Apr 24 21:34:43.340142 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:43.340097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-cert\") pod \"odh-model-controller-696fc77849-hg7rz\" (UID: \"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e\") " pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:43.342686 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:43.342660 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e-cert\") pod \"odh-model-controller-696fc77849-hg7rz\" (UID: \"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e\") " pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:43.584817 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:43.584782 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:43.746256 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:43.746221 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-hg7rz"] Apr 24 21:34:43.749985 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:34:43.749952 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b6eb7d7_ac63_4eac_9e1c_9fe2ec80fa9e.slice/crio-a5418ad0a00e7e14081bce3a661ab8813e870268178000027d33242fe8b71504 WatchSource:0}: Error finding container a5418ad0a00e7e14081bce3a661ab8813e870268178000027d33242fe8b71504: Status 404 returned error can't find the container with id a5418ad0a00e7e14081bce3a661ab8813e870268178000027d33242fe8b71504 Apr 24 21:34:43.971059 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:43.970942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-hg7rz" event={"ID":"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e","Type":"ContainerStarted","Data":"a5418ad0a00e7e14081bce3a661ab8813e870268178000027d33242fe8b71504"} Apr 24 21:34:43.972269 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:43.972238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hqssg" event={"ID":"cc602714-ec8b-41a5-b0c0-e3c463957643","Type":"ContainerStarted","Data":"12a5f853160da869966b9331eaefaa10d856d77fe487e0ac0a48a17bf0943cb4"} Apr 24 21:34:46.983073 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:46.982964 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/odh-model-controller-696fc77849-hg7rz" event={"ID":"0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e","Type":"ContainerStarted","Data":"ca7e3fb214e3fa2bd7fece97eb5d907e27eee78a9135acc93c35ad04b1e8aff9"} Apr 24 21:34:46.983601 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:46.983078 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:46.984277 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:46.984257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hqssg" event={"ID":"cc602714-ec8b-41a5-b0c0-e3c463957643","Type":"ContainerStarted","Data":"886e8365c4f3cf97668e7f04df885db6fd7c998c9189fc0f6011315487d547cd"} Apr 24 21:34:46.984427 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:46.984416 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:34:47.034912 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:47.034859 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-hg7rz" podStartSLOduration=2.087460331 podStartE2EDuration="5.034844395s" podCreationTimestamp="2026-04-24 21:34:42 +0000 UTC" firstStartedPulling="2026-04-24 21:34:43.751782594 +0000 UTC m=+484.599826696" lastFinishedPulling="2026-04-24 21:34:46.699166663 +0000 UTC m=+487.547210760" observedRunningTime="2026-04-24 21:34:47.034330872 +0000 UTC m=+487.882374990" watchObservedRunningTime="2026-04-24 21:34:47.034844395 +0000 UTC m=+487.882888501" Apr 24 21:34:47.075706 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:47.075657 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-hqssg" podStartSLOduration=1.486752793 podStartE2EDuration="5.075642073s" podCreationTimestamp="2026-04-24 21:34:42 +0000 UTC" firstStartedPulling="2026-04-24 21:34:43.106870306 +0000 UTC m=+483.954914399" 
lastFinishedPulling="2026-04-24 21:34:46.695759584 +0000 UTC m=+487.543803679" observedRunningTime="2026-04-24 21:34:47.074894225 +0000 UTC m=+487.922938344" watchObservedRunningTime="2026-04-24 21:34:47.075642073 +0000 UTC m=+487.923686189" Apr 24 21:34:57.989725 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:57.989689 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-hg7rz" Apr 24 21:34:57.991567 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:34:57.991544 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-hqssg" Apr 24 21:35:10.148997 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.148962 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn"] Apr 24 21:35:10.151077 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.151060 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:10.156665 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.156635 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:35:10.169895 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.169864 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn"] Apr 24 21:35:10.249888 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.249838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98rf5\" (UniqueName: \"kubernetes.io/projected/f43d217d-379e-4986-82d3-2c847f29b0e0-kube-api-access-98rf5\") pod \"seaweedfs-tls-custom-ddd4dbfd-fjfjn\" (UID: \"f43d217d-379e-4986-82d3-2c847f29b0e0\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:10.250073 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.249920 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f43d217d-379e-4986-82d3-2c847f29b0e0-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-fjfjn\" (UID: \"f43d217d-379e-4986-82d3-2c847f29b0e0\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:10.351230 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.351187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98rf5\" (UniqueName: \"kubernetes.io/projected/f43d217d-379e-4986-82d3-2c847f29b0e0-kube-api-access-98rf5\") pod \"seaweedfs-tls-custom-ddd4dbfd-fjfjn\" (UID: \"f43d217d-379e-4986-82d3-2c847f29b0e0\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:10.351439 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.351278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f43d217d-379e-4986-82d3-2c847f29b0e0-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-fjfjn\" (UID: \"f43d217d-379e-4986-82d3-2c847f29b0e0\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:10.351699 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.351679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f43d217d-379e-4986-82d3-2c847f29b0e0-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-fjfjn\" (UID: \"f43d217d-379e-4986-82d3-2c847f29b0e0\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:10.360012 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.359978 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98rf5\" (UniqueName: \"kubernetes.io/projected/f43d217d-379e-4986-82d3-2c847f29b0e0-kube-api-access-98rf5\") pod \"seaweedfs-tls-custom-ddd4dbfd-fjfjn\" (UID: \"f43d217d-379e-4986-82d3-2c847f29b0e0\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:10.460490 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:35:10.460393 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:10.581177 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:10.581149 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn"] Apr 24 21:35:10.583940 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:35:10.583912 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf43d217d_379e_4986_82d3_2c847f29b0e0.slice/crio-26b69f5f45bdf95d3579bd2468985a8dacbbae7deb363e603aedd4f268ea4fd2 WatchSource:0}: Error finding container 26b69f5f45bdf95d3579bd2468985a8dacbbae7deb363e603aedd4f268ea4fd2: Status 404 returned error can't find the container with id 26b69f5f45bdf95d3579bd2468985a8dacbbae7deb363e603aedd4f268ea4fd2 Apr 24 21:35:11.046649 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:11.046604 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" event={"ID":"f43d217d-379e-4986-82d3-2c847f29b0e0","Type":"ContainerStarted","Data":"a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b"} Apr 24 21:35:11.046649 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:11.046643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" event={"ID":"f43d217d-379e-4986-82d3-2c847f29b0e0","Type":"ContainerStarted","Data":"26b69f5f45bdf95d3579bd2468985a8dacbbae7deb363e603aedd4f268ea4fd2"} Apr 24 21:35:11.076836 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:11.076776 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" podStartSLOduration=0.820408484 podStartE2EDuration="1.076761806s" podCreationTimestamp="2026-04-24 21:35:10 +0000 UTC" firstStartedPulling="2026-04-24 21:35:10.585092612 +0000 UTC m=+511.433136706" 
lastFinishedPulling="2026-04-24 21:35:10.84144592 +0000 UTC m=+511.689490028" observedRunningTime="2026-04-24 21:35:11.076416414 +0000 UTC m=+511.924460530" watchObservedRunningTime="2026-04-24 21:35:11.076761806 +0000 UTC m=+511.924805922" Apr 24 21:35:12.501188 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:12.501152 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn"] Apr 24 21:35:13.052582 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:13.052522 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" podUID="f43d217d-379e-4986-82d3-2c847f29b0e0" containerName="seaweedfs-tls-custom" containerID="cri-o://a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b" gracePeriod=30 Apr 24 21:35:14.290218 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:14.290196 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:14.387710 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:14.387600 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98rf5\" (UniqueName: \"kubernetes.io/projected/f43d217d-379e-4986-82d3-2c847f29b0e0-kube-api-access-98rf5\") pod \"f43d217d-379e-4986-82d3-2c847f29b0e0\" (UID: \"f43d217d-379e-4986-82d3-2c847f29b0e0\") " Apr 24 21:35:14.387710 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:14.387665 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f43d217d-379e-4986-82d3-2c847f29b0e0-data\") pod \"f43d217d-379e-4986-82d3-2c847f29b0e0\" (UID: \"f43d217d-379e-4986-82d3-2c847f29b0e0\") " Apr 24 21:35:14.388787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:14.388758 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43d217d-379e-4986-82d3-2c847f29b0e0-data" 
(OuterVolumeSpecName: "data") pod "f43d217d-379e-4986-82d3-2c847f29b0e0" (UID: "f43d217d-379e-4986-82d3-2c847f29b0e0"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:14.389930 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:14.389911 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43d217d-379e-4986-82d3-2c847f29b0e0-kube-api-access-98rf5" (OuterVolumeSpecName: "kube-api-access-98rf5") pod "f43d217d-379e-4986-82d3-2c847f29b0e0" (UID: "f43d217d-379e-4986-82d3-2c847f29b0e0"). InnerVolumeSpecName "kube-api-access-98rf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:14.488406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:14.488362 2571 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f43d217d-379e-4986-82d3-2c847f29b0e0-data\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:35:14.488406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:14.488403 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-98rf5\" (UniqueName: \"kubernetes.io/projected/f43d217d-379e-4986-82d3-2c847f29b0e0-kube-api-access-98rf5\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:35:15.059134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.059100 2571 generic.go:358] "Generic (PLEG): container finished" podID="f43d217d-379e-4986-82d3-2c847f29b0e0" containerID="a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b" exitCode=0 Apr 24 21:35:15.059337 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.059145 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" event={"ID":"f43d217d-379e-4986-82d3-2c847f29b0e0","Type":"ContainerDied","Data":"a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b"} Apr 24 21:35:15.059337 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.059169 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" event={"ID":"f43d217d-379e-4986-82d3-2c847f29b0e0","Type":"ContainerDied","Data":"26b69f5f45bdf95d3579bd2468985a8dacbbae7deb363e603aedd4f268ea4fd2"} Apr 24 21:35:15.059337 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.059188 2571 scope.go:117] "RemoveContainer" containerID="a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b" Apr 24 21:35:15.059337 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.059188 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn" Apr 24 21:35:15.068340 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.068322 2571 scope.go:117] "RemoveContainer" containerID="a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b" Apr 24 21:35:15.068601 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:35:15.068579 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b\": container with ID starting with a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b not found: ID does not exist" containerID="a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b" Apr 24 21:35:15.068664 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.068614 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b"} err="failed to get container status \"a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b\": rpc error: code = NotFound desc = could not find container \"a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b\": container with ID starting with a323041ef2bf3cfac46dbd37bbc95121ec4ecb5f6ef7e34226885d1c97f3876b not found: ID does not exist" Apr 24 21:35:15.086073 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:35:15.086036 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn"] Apr 24 21:35:15.091161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.091138 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-fjfjn"] Apr 24 21:35:15.135174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.135144 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw"] Apr 24 21:35:15.135528 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.135512 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f43d217d-379e-4986-82d3-2c847f29b0e0" containerName="seaweedfs-tls-custom" Apr 24 21:35:15.135528 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.135530 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43d217d-379e-4986-82d3-2c847f29b0e0" containerName="seaweedfs-tls-custom" Apr 24 21:35:15.135651 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.135573 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f43d217d-379e-4986-82d3-2c847f29b0e0" containerName="seaweedfs-tls-custom" Apr 24 21:35:15.138272 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.138256 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.141419 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.141398 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 24 21:35:15.141519 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.141401 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:35:15.147608 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.147587 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw"] Apr 24 21:35:15.294360 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.294288 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9n7s\" (UniqueName: \"kubernetes.io/projected/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-kube-api-access-h9n7s\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: \"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.294723 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.294374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: \"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.294723 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.294403 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: \"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 
21:35:15.395673 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.395568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9n7s\" (UniqueName: \"kubernetes.io/projected/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-kube-api-access-h9n7s\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: \"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.395673 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.395618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: \"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.395673 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.395649 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: \"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.396004 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.395984 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: \"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.398318 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.398279 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: 
\"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.404821 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.404802 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9n7s\" (UniqueName: \"kubernetes.io/projected/ffd29e2e-d014-48a5-bdfd-c85537d5fe45-kube-api-access-h9n7s\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcllw\" (UID: \"ffd29e2e-d014-48a5-bdfd-c85537d5fe45\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.447435 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.447403 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" Apr 24 21:35:15.563847 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.563812 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw"] Apr 24 21:35:15.567561 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:35:15.567532 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffd29e2e_d014_48a5_bdfd_c85537d5fe45.slice/crio-9568a2310488ef633523a2f98a7ca8f62e53d59d4090cf02d010405a25eb38aa WatchSource:0}: Error finding container 9568a2310488ef633523a2f98a7ca8f62e53d59d4090cf02d010405a25eb38aa: Status 404 returned error can't find the container with id 9568a2310488ef633523a2f98a7ca8f62e53d59d4090cf02d010405a25eb38aa Apr 24 21:35:15.648777 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:15.648696 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43d217d-379e-4986-82d3-2c847f29b0e0" path="/var/lib/kubelet/pods/f43d217d-379e-4986-82d3-2c847f29b0e0/volumes" Apr 24 21:35:16.064403 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:16.064372 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" 
event={"ID":"ffd29e2e-d014-48a5-bdfd-c85537d5fe45","Type":"ContainerStarted","Data":"b394833112efba7f7f0e616bc5fd2012da0e1924ba5fc209985c267df69bc7d1"} Apr 24 21:35:16.064403 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:16.064403 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" event={"ID":"ffd29e2e-d014-48a5-bdfd-c85537d5fe45","Type":"ContainerStarted","Data":"9568a2310488ef633523a2f98a7ca8f62e53d59d4090cf02d010405a25eb38aa"} Apr 24 21:35:16.081500 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:16.081450 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcllw" podStartSLOduration=0.814715587 podStartE2EDuration="1.081438505s" podCreationTimestamp="2026-04-24 21:35:15 +0000 UTC" firstStartedPulling="2026-04-24 21:35:15.569245895 +0000 UTC m=+516.417289989" lastFinishedPulling="2026-04-24 21:35:15.835968807 +0000 UTC m=+516.684012907" observedRunningTime="2026-04-24 21:35:16.080627743 +0000 UTC m=+516.928671860" watchObservedRunningTime="2026-04-24 21:35:16.081438505 +0000 UTC m=+516.929482662" Apr 24 21:35:23.954656 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:23.954626 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb"] Apr 24 21:35:23.959417 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:23.959398 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:23.962000 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:23.961974 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cb1f1451-5b8c-4590-9836-205c97925933-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:23.962121 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:23.962015 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9lmw\" (UniqueName: \"kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-kube-api-access-d9lmw\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:23.962121 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:23.962048 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:23.962220 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:23.962179 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 24 21:35:23.962220 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:23.962191 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 21:35:23.965026 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:23.965002 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb"] 
Apr 24 21:35:24.062365 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.062329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:24.062563 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.062377 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cb1f1451-5b8c-4590-9836-205c97925933-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:24.062563 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:35:24.062491 2571 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 24 21:35:24.062563 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.062498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9lmw\" (UniqueName: \"kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-kube-api-access-d9lmw\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:24.062563 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:35:24.062510 2571 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb: secret "seaweedfs-tls-serving" not found Apr 24 21:35:24.062778 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:35:24.062605 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-seaweedfs-tls-serving 
podName:cb1f1451-5b8c-4590-9836-205c97925933 nodeName:}" failed. No retries permitted until 2026-04-24 21:35:24.562588947 +0000 UTC m=+525.410633041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-sgxlb" (UID: "cb1f1451-5b8c-4590-9836-205c97925933") : secret "seaweedfs-tls-serving" not found Apr 24 21:35:24.062778 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.062710 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cb1f1451-5b8c-4590-9836-205c97925933-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:24.071785 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.071757 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9lmw\" (UniqueName: \"kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-kube-api-access-d9lmw\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:24.566231 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.566195 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:24.568712 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.568679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: 
\"kubernetes.io/projected/cb1f1451-5b8c-4590-9836-205c97925933-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sgxlb\" (UID: \"cb1f1451-5b8c-4590-9836-205c97925933\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:24.569455 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.569438 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" Apr 24 21:35:24.697602 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:24.697444 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb"] Apr 24 21:35:24.699721 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:35:24.699697 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb1f1451_5b8c_4590_9836_205c97925933.slice/crio-86c2d87d15c611788eccfb033f7514d81afdd4097f769fa97ff03746d5d76f16 WatchSource:0}: Error finding container 86c2d87d15c611788eccfb033f7514d81afdd4097f769fa97ff03746d5d76f16: Status 404 returned error can't find the container with id 86c2d87d15c611788eccfb033f7514d81afdd4097f769fa97ff03746d5d76f16 Apr 24 21:35:25.090932 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:25.090895 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" event={"ID":"cb1f1451-5b8c-4590-9836-205c97925933","Type":"ContainerStarted","Data":"861dedfd83171a24e39702e254202de706a0af4145bb88f4c6791170c4e41c52"} Apr 24 21:35:25.090932 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:25.090934 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" event={"ID":"cb1f1451-5b8c-4590-9836-205c97925933","Type":"ContainerStarted","Data":"86c2d87d15c611788eccfb033f7514d81afdd4097f769fa97ff03746d5d76f16"} Apr 24 21:35:25.110049 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:25.109992 2571 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sgxlb" podStartSLOduration=1.8610119470000002 podStartE2EDuration="2.109977341s" podCreationTimestamp="2026-04-24 21:35:23 +0000 UTC" firstStartedPulling="2026-04-24 21:35:24.70099079 +0000 UTC m=+525.549034883" lastFinishedPulling="2026-04-24 21:35:24.949956181 +0000 UTC m=+525.798000277" observedRunningTime="2026-04-24 21:35:25.108164662 +0000 UTC m=+525.956208790" watchObservedRunningTime="2026-04-24 21:35:25.109977341 +0000 UTC m=+525.958021457" Apr 24 21:35:42.578694 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.578662 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x"] Apr 24 21:35:42.583440 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.583418 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.586258 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.586233 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 24 21:35:42.586487 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.586467 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:35:42.586559 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.586519 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-j8kq7\"" Apr 24 21:35:42.586722 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.586686 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:35:42.586722 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.586686 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 24 21:35:42.594819 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.594797 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x"] Apr 24 21:35:42.702772 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.702737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d18a6996-22e7-49b0-9757-075865792886-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.702772 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.702788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d18a6996-22e7-49b0-9757-075865792886-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.702987 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.702853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d18a6996-22e7-49b0-9757-075865792886-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.702987 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.702887 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsmjh\" 
(UniqueName: \"kubernetes.io/projected/d18a6996-22e7-49b0-9757-075865792886-kube-api-access-vsmjh\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.804117 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.804074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d18a6996-22e7-49b0-9757-075865792886-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.804343 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.804131 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d18a6996-22e7-49b0-9757-075865792886-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.804343 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.804160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsmjh\" (UniqueName: \"kubernetes.io/projected/d18a6996-22e7-49b0-9757-075865792886-kube-api-access-vsmjh\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.804343 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.804193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d18a6996-22e7-49b0-9757-075865792886-kserve-provision-location\") pod 
\"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.804649 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.804627 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d18a6996-22e7-49b0-9757-075865792886-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.804884 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.804861 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d18a6996-22e7-49b0-9757-075865792886-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.806750 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.806726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d18a6996-22e7-49b0-9757-075865792886-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.813747 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.813727 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsmjh\" (UniqueName: \"kubernetes.io/projected/d18a6996-22e7-49b0-9757-075865792886-kube-api-access-vsmjh\") pod \"isvc-sklearn-batcher-predictor-66c65b668-gwk8x\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:42.895376 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:42.895279 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:35:43.036765 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:43.036735 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x"] Apr 24 21:35:43.040185 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:35:43.040155 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18a6996_22e7_49b0_9757_075865792886.slice/crio-9ce0177f96ad83502bf18f11af9d6fa7cebcfefd688d679b490a1d5d4640b275 WatchSource:0}: Error finding container 9ce0177f96ad83502bf18f11af9d6fa7cebcfefd688d679b490a1d5d4640b275: Status 404 returned error can't find the container with id 9ce0177f96ad83502bf18f11af9d6fa7cebcfefd688d679b490a1d5d4640b275 Apr 24 21:35:43.141274 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:43.141238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerStarted","Data":"9ce0177f96ad83502bf18f11af9d6fa7cebcfefd688d679b490a1d5d4640b275"} Apr 24 21:35:46.151534 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:46.151483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerStarted","Data":"9aebb3a1dfca637a92780a9f79efcbe5201d3c6fc567ad65492c75e63f3d6f3c"} Apr 24 21:35:50.164236 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:50.164148 2571 generic.go:358] "Generic (PLEG): container finished" podID="d18a6996-22e7-49b0-9757-075865792886" 
containerID="9aebb3a1dfca637a92780a9f79efcbe5201d3c6fc567ad65492c75e63f3d6f3c" exitCode=0 Apr 24 21:35:50.164236 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:35:50.164226 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerDied","Data":"9aebb3a1dfca637a92780a9f79efcbe5201d3c6fc567ad65492c75e63f3d6f3c"} Apr 24 21:36:04.216142 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:04.216090 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerStarted","Data":"c9b1aad03d341efe1f5729ed292a3314d6970efb73051dbb4838394162b14baf"} Apr 24 21:36:06.224964 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:06.224863 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerStarted","Data":"53053a52ec385040eabecc0aeaeb49643b809e3fea4b985d54aafe5e0eab5bdb"} Apr 24 21:36:12.247436 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:12.247399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerStarted","Data":"b9d6bf0d21dbd65626600098054d2044e9c2fa98941e4af5d7eb0c1c120d55cb"} Apr 24 21:36:12.247971 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:12.247807 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:36:12.247971 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:12.247835 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 
21:36:12.249253 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:12.249206 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:36:12.253138 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:12.253116 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:36:12.269622 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:12.269579 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podStartSLOduration=1.550218554 podStartE2EDuration="30.269567037s" podCreationTimestamp="2026-04-24 21:35:42 +0000 UTC" firstStartedPulling="2026-04-24 21:35:43.042118544 +0000 UTC m=+543.890162638" lastFinishedPulling="2026-04-24 21:36:11.761467012 +0000 UTC m=+572.609511121" observedRunningTime="2026-04-24 21:36:12.268398596 +0000 UTC m=+573.116442735" watchObservedRunningTime="2026-04-24 21:36:12.269567037 +0000 UTC m=+573.117611154" Apr 24 21:36:13.250465 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:13.250433 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:36:13.250888 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:13.250556 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:36:13.251093 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:13.251075 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:14.253603 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:14.253565 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:36:14.253989 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:14.253973 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:24.253706 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:24.253661 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:36:24.254250 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:24.254100 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:34.254210 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:34.254154 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:36:34.254729 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:34.254656 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:44.254237 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:44.254183 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:36:44.254737 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:44.254679 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:54.254431 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:54.254381 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:36:54.255101 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:36:54.255075 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:04.253688 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:04.253594 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:37:04.254140 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:04.254067 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:14.254090 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:14.254060 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:37:14.254562 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:14.254285 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:37:27.502513 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.502476 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x"] Apr 24 21:37:27.502966 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.502938 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" containerID="cri-o://c9b1aad03d341efe1f5729ed292a3314d6970efb73051dbb4838394162b14baf" gracePeriod=30 Apr 24 21:37:27.503018 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.502960 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" 
containerID="cri-o://b9d6bf0d21dbd65626600098054d2044e9c2fa98941e4af5d7eb0c1c120d55cb" gracePeriod=30 Apr 24 21:37:27.503018 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.502996 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" containerID="cri-o://53053a52ec385040eabecc0aeaeb49643b809e3fea4b985d54aafe5e0eab5bdb" gracePeriod=30 Apr 24 21:37:27.660630 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.660588 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"] Apr 24 21:37:27.663722 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.663706 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.667060 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.667034 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 24 21:37:27.667155 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.667034 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 24 21:37:27.756167 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.756044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93d83c36-8a36-48bf-aa95-76e25d3071a2-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.756167 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.756089 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpp7d\" (UniqueName: \"kubernetes.io/projected/93d83c36-8a36-48bf-aa95-76e25d3071a2-kube-api-access-rpp7d\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.756438 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.756189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93d83c36-8a36-48bf-aa95-76e25d3071a2-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.756438 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.756261 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93d83c36-8a36-48bf-aa95-76e25d3071a2-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.756642 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.756623 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"] Apr 24 21:37:27.857506 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.857462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/93d83c36-8a36-48bf-aa95-76e25d3071a2-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.857506 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.857506 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpp7d\" (UniqueName: \"kubernetes.io/projected/93d83c36-8a36-48bf-aa95-76e25d3071a2-kube-api-access-rpp7d\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.857761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.857530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93d83c36-8a36-48bf-aa95-76e25d3071a2-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.857761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.857651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93d83c36-8a36-48bf-aa95-76e25d3071a2-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.858053 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.858032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/93d83c36-8a36-48bf-aa95-76e25d3071a2-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.858151 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.858134 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93d83c36-8a36-48bf-aa95-76e25d3071a2-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.860120 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.860103 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93d83c36-8a36-48bf-aa95-76e25d3071a2-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.872642 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.872620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpp7d\" (UniqueName: \"kubernetes.io/projected/93d83c36-8a36-48bf-aa95-76e25d3071a2-kube-api-access-rpp7d\") pod \"isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:27.973500 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:27.973456 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:28.107699 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:28.107666 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"] Apr 24 21:37:28.108055 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:37:28.108030 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93d83c36_8a36_48bf_aa95_76e25d3071a2.slice/crio-3a0674764978907dabaed49bc80a982296fad48d6637a01087c994c199844b68 WatchSource:0}: Error finding container 3a0674764978907dabaed49bc80a982296fad48d6637a01087c994c199844b68: Status 404 returned error can't find the container with id 3a0674764978907dabaed49bc80a982296fad48d6637a01087c994c199844b68 Apr 24 21:37:28.481511 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:28.481433 2571 generic.go:358] "Generic (PLEG): container finished" podID="d18a6996-22e7-49b0-9757-075865792886" containerID="53053a52ec385040eabecc0aeaeb49643b809e3fea4b985d54aafe5e0eab5bdb" exitCode=2 Apr 24 21:37:28.481660 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:28.481509 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerDied","Data":"53053a52ec385040eabecc0aeaeb49643b809e3fea4b985d54aafe5e0eab5bdb"} Apr 24 21:37:28.482796 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:28.482775 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerStarted","Data":"660bc1aac2e4b4d51bb5da62557d6aea3e670ea3f7fea0f36ba100ad92841c79"} Apr 24 21:37:28.482796 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:28.482799 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerStarted","Data":"3a0674764978907dabaed49bc80a982296fad48d6637a01087c994c199844b68"} Apr 24 21:37:32.248969 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:32.248931 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.19:8643/healthz\": dial tcp 10.134.0.19:8643: connect: connection refused" Apr 24 21:37:32.494568 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:32.494535 2571 generic.go:358] "Generic (PLEG): container finished" podID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerID="660bc1aac2e4b4d51bb5da62557d6aea3e670ea3f7fea0f36ba100ad92841c79" exitCode=0 Apr 24 21:37:32.494732 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:32.494608 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerDied","Data":"660bc1aac2e4b4d51bb5da62557d6aea3e670ea3f7fea0f36ba100ad92841c79"} Apr 24 21:37:32.496902 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:32.496878 2571 generic.go:358] "Generic (PLEG): container finished" podID="d18a6996-22e7-49b0-9757-075865792886" containerID="c9b1aad03d341efe1f5729ed292a3314d6970efb73051dbb4838394162b14baf" exitCode=0 Apr 24 21:37:32.496996 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:32.496920 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerDied","Data":"c9b1aad03d341efe1f5729ed292a3314d6970efb73051dbb4838394162b14baf"} Apr 24 21:37:33.501774 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.501732 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerStarted","Data":"313bb1bf6f161de6ecf67910b68837a924d4f5b54a8d8c85f4fddd1e396dd3e0"} Apr 24 21:37:33.501774 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.501781 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerStarted","Data":"3f6ea854b17a3453babafaadd713857ff27cc1765a44769c9bfcb9924da8a77e"} Apr 24 21:37:33.502249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.501791 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerStarted","Data":"12b3bf6becd47384f79a09ecf132ea1130d41a969c8cb9621672dc168e975e9e"} Apr 24 21:37:33.502249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.502238 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:33.502381 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.502267 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:33.502381 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.502279 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:33.503730 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.503700 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:37:33.504349 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.504329 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:33.524178 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:33.524138 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podStartSLOduration=6.524125034 podStartE2EDuration="6.524125034s" podCreationTimestamp="2026-04-24 21:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:37:33.523219867 +0000 UTC m=+654.371263993" watchObservedRunningTime="2026-04-24 21:37:33.524125034 +0000 UTC m=+654.372169150" Apr 24 21:37:34.254009 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:34.253960 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:37:34.254359 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:34.254330 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:34.505211 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:34.505118 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:37:34.505674 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:34.505582 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:37.248513 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:37.248474 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.19:8643/healthz\": dial tcp 10.134.0.19:8643: connect: connection refused" Apr 24 21:37:39.509733 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:39.509699 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:37:39.510322 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:39.510267 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:37:39.510653 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:39.510634 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 
21:37:42.248520 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:42.248479 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.19:8643/healthz\": dial tcp 10.134.0.19:8643: connect: connection refused" Apr 24 21:37:42.248917 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:42.248620 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:37:44.253949 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:44.253913 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:37:44.254448 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:44.254336 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:47.248423 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:47.248372 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.19:8643/healthz\": dial tcp 10.134.0.19:8643: connect: connection refused" Apr 24 21:37:49.510472 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:49.510432 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" 
podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:37:49.510953 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:49.510784 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:52.248408 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:52.248364 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.19:8643/healthz\": dial tcp 10.134.0.19:8643: connect: connection refused" Apr 24 21:37:54.254271 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:54.254227 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 24 21:37:54.254720 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:54.254402 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:37:54.254720 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:54.254674 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:54.254811 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:54.254768 2571 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:37:57.248178 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.248135 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.19:8643/healthz\": dial tcp 10.134.0.19:8643: connect: connection refused" Apr 24 21:37:57.572501 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.572078 2571 generic.go:358] "Generic (PLEG): container finished" podID="d18a6996-22e7-49b0-9757-075865792886" containerID="b9d6bf0d21dbd65626600098054d2044e9c2fa98941e4af5d7eb0c1c120d55cb" exitCode=0 Apr 24 21:37:57.572501 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.572140 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerDied","Data":"b9d6bf0d21dbd65626600098054d2044e9c2fa98941e4af5d7eb0c1c120d55cb"} Apr 24 21:37:57.651607 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.651574 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:37:57.694639 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.694607 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d18a6996-22e7-49b0-9757-075865792886-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"d18a6996-22e7-49b0-9757-075865792886\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " Apr 24 21:37:57.694639 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.694655 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d18a6996-22e7-49b0-9757-075865792886-kserve-provision-location\") pod \"d18a6996-22e7-49b0-9757-075865792886\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " Apr 24 21:37:57.694854 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.694695 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsmjh\" (UniqueName: \"kubernetes.io/projected/d18a6996-22e7-49b0-9757-075865792886-kube-api-access-vsmjh\") pod \"d18a6996-22e7-49b0-9757-075865792886\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " Apr 24 21:37:57.694854 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.694753 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d18a6996-22e7-49b0-9757-075865792886-proxy-tls\") pod \"d18a6996-22e7-49b0-9757-075865792886\" (UID: \"d18a6996-22e7-49b0-9757-075865792886\") " Apr 24 21:37:57.695030 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.695000 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d18a6996-22e7-49b0-9757-075865792886-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"d18a6996-22e7-49b0-9757-075865792886" (UID: "d18a6996-22e7-49b0-9757-075865792886"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:57.695102 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.695012 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d18a6996-22e7-49b0-9757-075865792886-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "d18a6996-22e7-49b0-9757-075865792886" (UID: "d18a6996-22e7-49b0-9757-075865792886"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:37:57.697075 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.697033 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18a6996-22e7-49b0-9757-075865792886-kube-api-access-vsmjh" (OuterVolumeSpecName: "kube-api-access-vsmjh") pod "d18a6996-22e7-49b0-9757-075865792886" (UID: "d18a6996-22e7-49b0-9757-075865792886"). InnerVolumeSpecName "kube-api-access-vsmjh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:57.697075 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.697058 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18a6996-22e7-49b0-9757-075865792886-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d18a6996-22e7-49b0-9757-075865792886" (UID: "d18a6996-22e7-49b0-9757-075865792886"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:37:57.795534 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.795451 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vsmjh\" (UniqueName: \"kubernetes.io/projected/d18a6996-22e7-49b0-9757-075865792886-kube-api-access-vsmjh\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:37:57.795534 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.795482 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d18a6996-22e7-49b0-9757-075865792886-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:37:57.795534 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.795493 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d18a6996-22e7-49b0-9757-075865792886-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:37:57.795534 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:57.795503 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d18a6996-22e7-49b0-9757-075865792886-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:37:58.576608 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:58.576572 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" event={"ID":"d18a6996-22e7-49b0-9757-075865792886","Type":"ContainerDied","Data":"9ce0177f96ad83502bf18f11af9d6fa7cebcfefd688d679b490a1d5d4640b275"} Apr 24 21:37:58.576608 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:58.576614 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x" Apr 24 21:37:58.577098 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:58.576624 2571 scope.go:117] "RemoveContainer" containerID="b9d6bf0d21dbd65626600098054d2044e9c2fa98941e4af5d7eb0c1c120d55cb" Apr 24 21:37:58.584560 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:58.584540 2571 scope.go:117] "RemoveContainer" containerID="53053a52ec385040eabecc0aeaeb49643b809e3fea4b985d54aafe5e0eab5bdb" Apr 24 21:37:58.591802 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:58.591781 2571 scope.go:117] "RemoveContainer" containerID="c9b1aad03d341efe1f5729ed292a3314d6970efb73051dbb4838394162b14baf" Apr 24 21:37:58.599320 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:58.599257 2571 scope.go:117] "RemoveContainer" containerID="9aebb3a1dfca637a92780a9f79efcbe5201d3c6fc567ad65492c75e63f3d6f3c" Apr 24 21:37:58.604446 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:58.602061 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x"] Apr 24 21:37:58.607675 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:58.607652 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-66c65b668-gwk8x"] Apr 24 21:37:59.510711 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:59.510675 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:37:59.511070 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:59.511048 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP 
probe failed with statuscode: 503" Apr 24 21:37:59.647676 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:37:59.647638 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18a6996-22e7-49b0-9757-075865792886" path="/var/lib/kubelet/pods/d18a6996-22e7-49b0-9757-075865792886/volumes" Apr 24 21:38:09.510245 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:09.510197 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:38:09.510825 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:09.510771 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:19.510728 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:19.510682 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:38:19.511169 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:19.511140 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:29.510401 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:29.510276 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" 
podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:38:29.510862 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:29.510833 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:39.510527 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:39.510485 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:38:39.510999 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:39.510978 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:38:52.696010 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.695977 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"] Apr 24 21:38:52.696525 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.696348 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" containerID="cri-o://12b3bf6becd47384f79a09ecf132ea1130d41a969c8cb9621672dc168e975e9e" gracePeriod=30 Apr 24 21:38:52.696525 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.696419 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" 
containerID="cri-o://313bb1bf6f161de6ecf67910b68837a924d4f5b54a8d8c85f4fddd1e396dd3e0" gracePeriod=30 Apr 24 21:38:52.696652 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.696602 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy" containerID="cri-o://3f6ea854b17a3453babafaadd713857ff27cc1765a44769c9bfcb9924da8a77e" gracePeriod=30 Apr 24 21:38:52.746920 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.746887 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc"] Apr 24 21:38:52.747212 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747200 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" Apr 24 21:38:52.747254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747213 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" Apr 24 21:38:52.747254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747222 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="storage-initializer" Apr 24 21:38:52.747254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747228 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="storage-initializer" Apr 24 21:38:52.747254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747238 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" Apr 24 21:38:52.747254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747244 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" Apr 24 21:38:52.747254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747253 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" Apr 24 21:38:52.747522 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747260 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" Apr 24 21:38:52.747522 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747320 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="agent" Apr 24 21:38:52.747522 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747330 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kserve-container" Apr 24 21:38:52.747522 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.747339 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d18a6996-22e7-49b0-9757-075865792886" containerName="kube-rbac-proxy" Apr 24 21:38:52.751515 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.751497 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:52.754351 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.754328 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 24 21:38:52.754450 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.754384 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 24 21:38:52.761839 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.761817 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc"] Apr 24 21:38:52.843007 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.842979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dceaf91-d766-45d9-a809-da6227a9a1b3-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:52.843166 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.843042 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xznsf\" (UniqueName: \"kubernetes.io/projected/5dceaf91-d766-45d9-a809-da6227a9a1b3-kube-api-access-xznsf\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:52.843166 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.843093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:52.944013 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.943963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xznsf\" (UniqueName: \"kubernetes.io/projected/5dceaf91-d766-45d9-a809-da6227a9a1b3-kube-api-access-xznsf\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:52.944212 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.944032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:52.944212 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.944079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dceaf91-d766-45d9-a809-da6227a9a1b3-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:52.944212 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:38:52.944190 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 24 21:38:52.944425 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:38:52.944259 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls podName:5dceaf91-d766-45d9-a809-da6227a9a1b3 nodeName:}" failed. No retries permitted until 2026-04-24 21:38:53.444242662 +0000 UTC m=+734.292286760 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-wnhtc" (UID: "5dceaf91-d766-45d9-a809-da6227a9a1b3") : secret "message-dumper-predictor-serving-cert" not found Apr 24 21:38:52.944710 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.944691 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dceaf91-d766-45d9-a809-da6227a9a1b3-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:52.952855 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:52.952790 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xznsf\" (UniqueName: \"kubernetes.io/projected/5dceaf91-d766-45d9-a809-da6227a9a1b3-kube-api-access-xznsf\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:53.448747 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:53.448706 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:53.451401 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:53.451380 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-wnhtc\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:53.661633 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:53.661592 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:53.732231 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:53.732196 2571 generic.go:358] "Generic (PLEG): container finished" podID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerID="3f6ea854b17a3453babafaadd713857ff27cc1765a44769c9bfcb9924da8a77e" exitCode=2 Apr 24 21:38:53.732787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:53.732249 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerDied","Data":"3f6ea854b17a3453babafaadd713857ff27cc1765a44769c9bfcb9924da8a77e"} Apr 24 21:38:53.790118 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:53.790079 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc"] Apr 24 21:38:53.793109 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:38:53.793075 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dceaf91_d766_45d9_a809_da6227a9a1b3.slice/crio-e7e14dce7654669b307b6c974d8f37532bb3647b24ada88c39e881730debc60d WatchSource:0}: Error finding container e7e14dce7654669b307b6c974d8f37532bb3647b24ada88c39e881730debc60d: Status 404 returned error can't find the container with id e7e14dce7654669b307b6c974d8f37532bb3647b24ada88c39e881730debc60d Apr 24 21:38:53.794947 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:38:53.794931 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:38:54.506097 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:54.506058 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 24 21:38:54.736755 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:54.736721 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" event={"ID":"5dceaf91-d766-45d9-a809-da6227a9a1b3","Type":"ContainerStarted","Data":"e7e14dce7654669b307b6c974d8f37532bb3647b24ada88c39e881730debc60d"} Apr 24 21:38:55.741265 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:55.741230 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" event={"ID":"5dceaf91-d766-45d9-a809-da6227a9a1b3","Type":"ContainerStarted","Data":"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da"} Apr 24 21:38:55.741265 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:55.741267 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" event={"ID":"5dceaf91-d766-45d9-a809-da6227a9a1b3","Type":"ContainerStarted","Data":"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb"} Apr 24 21:38:55.741720 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:55.741372 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:55.761911 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:55.761846 2571 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" podStartSLOduration=2.653533156 podStartE2EDuration="3.761822695s" podCreationTimestamp="2026-04-24 21:38:52 +0000 UTC" firstStartedPulling="2026-04-24 21:38:53.79506072 +0000 UTC m=+734.643104814" lastFinishedPulling="2026-04-24 21:38:54.90335026 +0000 UTC m=+735.751394353" observedRunningTime="2026-04-24 21:38:55.760133602 +0000 UTC m=+736.608177718" watchObservedRunningTime="2026-04-24 21:38:55.761822695 +0000 UTC m=+736.609866812" Apr 24 21:38:56.744368 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:56.744328 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:56.746050 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:56.746028 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:38:57.749030 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:57.748997 2571 generic.go:358] "Generic (PLEG): container finished" podID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerID="12b3bf6becd47384f79a09ecf132ea1130d41a969c8cb9621672dc168e975e9e" exitCode=0 Apr 24 21:38:57.749440 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:57.749065 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerDied","Data":"12b3bf6becd47384f79a09ecf132ea1130d41a969c8cb9621672dc168e975e9e"} Apr 24 21:38:59.505770 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:59.505724 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: 
connection refused" Apr 24 21:38:59.511152 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:59.511121 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:38:59.511489 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:38:59.511465 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:03.756359 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:03.756315 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:39:04.505826 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:04.505784 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 24 21:39:04.506016 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:04.505945 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" Apr 24 21:39:09.506186 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:09.506141 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": 
dial tcp 10.134.0.20:8643: connect: connection refused" Apr 24 21:39:09.510479 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:09.510451 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused" Apr 24 21:39:09.512031 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:09.512007 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:12.787078 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.787041 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"] Apr 24 21:39:12.790338 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.790310 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:12.792664 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.792643 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 24 21:39:12.792782 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.792681 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 24 21:39:12.803493 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.803466 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"] Apr 24 21:39:12.916579 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.916542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f5c641c0-e025-421b-a1ac-cba671e2b033-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:12.916791 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.916597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c641c0-e025-421b-a1ac-cba671e2b033-proxy-tls\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:12.916791 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.916643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5c641c0-e025-421b-a1ac-cba671e2b033-kserve-provision-location\") pod 
\"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:12.916791 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:12.916751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45hjs\" (UniqueName: \"kubernetes.io/projected/f5c641c0-e025-421b-a1ac-cba671e2b033-kube-api-access-45hjs\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:13.017783 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.017741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c641c0-e025-421b-a1ac-cba671e2b033-proxy-tls\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:13.017968 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.017791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5c641c0-e025-421b-a1ac-cba671e2b033-kserve-provision-location\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:13.017968 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.017861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45hjs\" (UniqueName: \"kubernetes.io/projected/f5c641c0-e025-421b-a1ac-cba671e2b033-kube-api-access-45hjs\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:13.017968 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:39:13.017902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f5c641c0-e025-421b-a1ac-cba671e2b033-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:13.018218 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.018185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5c641c0-e025-421b-a1ac-cba671e2b033-kserve-provision-location\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:13.018591 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.018573 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f5c641c0-e025-421b-a1ac-cba671e2b033-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:13.020475 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.020455 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c641c0-e025-421b-a1ac-cba671e2b033-proxy-tls\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:39:13.027355 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.027333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45hjs\" (UniqueName: 
\"kubernetes.io/projected/f5c641c0-e025-421b-a1ac-cba671e2b033-kube-api-access-45hjs\") pod \"isvc-logger-predictor-d94d7847-vgjkb\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"
Apr 24 21:39:13.100878 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.100776 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"
Apr 24 21:39:13.230953 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.230924 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"]
Apr 24 21:39:13.233568 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:39:13.233540 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c641c0_e025_421b_a1ac_cba671e2b033.slice/crio-41db4b9587b17d2ed12d49d8ac0078e252e4f93e1df2287ffb6fe28431e09866 WatchSource:0}: Error finding container 41db4b9587b17d2ed12d49d8ac0078e252e4f93e1df2287ffb6fe28431e09866: Status 404 returned error can't find the container with id 41db4b9587b17d2ed12d49d8ac0078e252e4f93e1df2287ffb6fe28431e09866
Apr 24 21:39:13.794550 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.794511 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerStarted","Data":"59a3fe931962e86062d7a8c7d02748f9679ca42e791fcc9228e8e31d19938c59"}
Apr 24 21:39:13.794550 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:13.794555 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerStarted","Data":"41db4b9587b17d2ed12d49d8ac0078e252e4f93e1df2287ffb6fe28431e09866"}
Apr 24 21:39:14.505568 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:14.505527 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused"
Apr 24 21:39:16.805961 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:16.805926 2571 generic.go:358] "Generic (PLEG): container finished" podID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerID="59a3fe931962e86062d7a8c7d02748f9679ca42e791fcc9228e8e31d19938c59" exitCode=0
Apr 24 21:39:16.806325 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:16.806000 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerDied","Data":"59a3fe931962e86062d7a8c7d02748f9679ca42e791fcc9228e8e31d19938c59"}
Apr 24 21:39:17.810853 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:17.810817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerStarted","Data":"b65b4dbf58df61a3a73ea4465105435965ff7980d0af222af025184c4c6d2400"}
Apr 24 21:39:17.810853 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:17.810856 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerStarted","Data":"5b77add3688cbf13a83d9696874341828e54844c3045bdadb2d323513babcccd"}
Apr 24 21:39:17.811249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:17.810872 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerStarted","Data":"7d725d6b4ccf75671e260c95aba06ec014a45351cd5a91b305f140981badc949"}
Apr 24 21:39:17.811249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:17.811176 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"
Apr 24 21:39:17.833380 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:17.833327 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podStartSLOduration=5.833287255 podStartE2EDuration="5.833287255s" podCreationTimestamp="2026-04-24 21:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:17.831592771 +0000 UTC m=+758.679636899" watchObservedRunningTime="2026-04-24 21:39:17.833287255 +0000 UTC m=+758.681331376"
Apr 24 21:39:18.814428 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:18.814389 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"
Apr 24 21:39:18.814428 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:18.814427 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"
Apr 24 21:39:18.815794 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:18.815762 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:39:18.816498 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:18.816471 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:19.505467 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:19.505423 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused"
Apr 24 21:39:19.510783 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:19.510759 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:5000: connect: connection refused"
Apr 24 21:39:19.510915 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:19.510902 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"
Apr 24 21:39:19.511493 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:19.511470 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:19.511593 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:19.511563 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"
Apr 24 21:39:19.817595 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:19.817490 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:39:19.818001 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:19.817977 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:22.828603 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.828572 2571 generic.go:358] "Generic (PLEG): container finished" podID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerID="313bb1bf6f161de6ecf67910b68837a924d4f5b54a8d8c85f4fddd1e396dd3e0" exitCode=0
Apr 24 21:39:22.828922 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.828636 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerDied","Data":"313bb1bf6f161de6ecf67910b68837a924d4f5b54a8d8c85f4fddd1e396dd3e0"}
Apr 24 21:39:22.843561 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.843538 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"
Apr 24 21:39:22.906271 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.906238 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93d83c36-8a36-48bf-aa95-76e25d3071a2-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"93d83c36-8a36-48bf-aa95-76e25d3071a2\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") "
Apr 24 21:39:22.906472 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.906291 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93d83c36-8a36-48bf-aa95-76e25d3071a2-proxy-tls\") pod \"93d83c36-8a36-48bf-aa95-76e25d3071a2\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") "
Apr 24 21:39:22.906472 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.906343 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpp7d\" (UniqueName: \"kubernetes.io/projected/93d83c36-8a36-48bf-aa95-76e25d3071a2-kube-api-access-rpp7d\") pod \"93d83c36-8a36-48bf-aa95-76e25d3071a2\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") "
Apr 24 21:39:22.906688 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.906664 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d83c36-8a36-48bf-aa95-76e25d3071a2-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "93d83c36-8a36-48bf-aa95-76e25d3071a2" (UID: "93d83c36-8a36-48bf-aa95-76e25d3071a2"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:39:22.908614 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.908590 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d83c36-8a36-48bf-aa95-76e25d3071a2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "93d83c36-8a36-48bf-aa95-76e25d3071a2" (UID: "93d83c36-8a36-48bf-aa95-76e25d3071a2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:39:22.908614 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:22.908597 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d83c36-8a36-48bf-aa95-76e25d3071a2-kube-api-access-rpp7d" (OuterVolumeSpecName: "kube-api-access-rpp7d") pod "93d83c36-8a36-48bf-aa95-76e25d3071a2" (UID: "93d83c36-8a36-48bf-aa95-76e25d3071a2"). InnerVolumeSpecName "kube-api-access-rpp7d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:39:23.007363 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.007335 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93d83c36-8a36-48bf-aa95-76e25d3071a2-kserve-provision-location\") pod \"93d83c36-8a36-48bf-aa95-76e25d3071a2\" (UID: \"93d83c36-8a36-48bf-aa95-76e25d3071a2\") "
Apr 24 21:39:23.007625 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.007606 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93d83c36-8a36-48bf-aa95-76e25d3071a2-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:39:23.007670 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.007632 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93d83c36-8a36-48bf-aa95-76e25d3071a2-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:39:23.007670 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.007646 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rpp7d\" (UniqueName: \"kubernetes.io/projected/93d83c36-8a36-48bf-aa95-76e25d3071a2-kube-api-access-rpp7d\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:39:23.007736 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.007678 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93d83c36-8a36-48bf-aa95-76e25d3071a2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "93d83c36-8a36-48bf-aa95-76e25d3071a2" (UID: "93d83c36-8a36-48bf-aa95-76e25d3071a2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:39:23.108136 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.108090 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93d83c36-8a36-48bf-aa95-76e25d3071a2-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:39:23.834119 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.834082 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp" event={"ID":"93d83c36-8a36-48bf-aa95-76e25d3071a2","Type":"ContainerDied","Data":"3a0674764978907dabaed49bc80a982296fad48d6637a01087c994c199844b68"}
Apr 24 21:39:23.834596 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.834130 2571 scope.go:117] "RemoveContainer" containerID="313bb1bf6f161de6ecf67910b68837a924d4f5b54a8d8c85f4fddd1e396dd3e0"
Apr 24 21:39:23.834596 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.834176 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"
Apr 24 21:39:23.841886 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.841867 2571 scope.go:117] "RemoveContainer" containerID="3f6ea854b17a3453babafaadd713857ff27cc1765a44769c9bfcb9924da8a77e"
Apr 24 21:39:23.848941 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.848912 2571 scope.go:117] "RemoveContainer" containerID="12b3bf6becd47384f79a09ecf132ea1130d41a969c8cb9621672dc168e975e9e"
Apr 24 21:39:23.855510 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.855486 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"]
Apr 24 21:39:23.856359 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.856343 2571 scope.go:117] "RemoveContainer" containerID="660bc1aac2e4b4d51bb5da62557d6aea3e670ea3f7fea0f36ba100ad92841c79"
Apr 24 21:39:23.860446 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:23.860425 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7cdbc5689-hhwbp"]
Apr 24 21:39:24.822539 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:24.822507 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"
Apr 24 21:39:24.823122 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:24.823079 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:39:24.823622 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:24.823594 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:25.648488 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:25.648453 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" path="/var/lib/kubelet/pods/93d83c36-8a36-48bf-aa95-76e25d3071a2/volumes"
Apr 24 21:39:34.823467 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:34.823413 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:39:34.824009 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:34.823860 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:44.823544 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:44.823502 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:39:44.824024 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:44.823988 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:54.823527 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:54.823477 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:39:54.824092 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:39:54.824068 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:04.823492 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:04.823400 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:40:04.824018 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:04.823984 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:14.823214 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:14.823172 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:40:14.824098 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:14.824066 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:24.824462 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:24.824431 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"
Apr 24 21:40:24.824923 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:24.824511 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"
Apr 24 21:40:37.847315 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:37.847264 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-wnhtc_5dceaf91-d766-45d9-a809-da6227a9a1b3/kserve-container/0.log"
Apr 24 21:40:38.028429 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.028398 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"]
Apr 24 21:40:38.028897 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.028840 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" containerID="cri-o://7d725d6b4ccf75671e260c95aba06ec014a45351cd5a91b305f140981badc949" gracePeriod=30
Apr 24 21:40:38.028989 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.028874 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" containerID="cri-o://b65b4dbf58df61a3a73ea4465105435965ff7980d0af222af025184c4c6d2400" gracePeriod=30
Apr 24 21:40:38.028989 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.028874 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" containerID="cri-o://5b77add3688cbf13a83d9696874341828e54844c3045bdadb2d323513babcccd" gracePeriod=30
Apr 24 21:40:38.087889 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.087858 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"]
Apr 24 21:40:38.088171 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088160 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="storage-initializer"
Apr 24 21:40:38.088215 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088173 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="storage-initializer"
Apr 24 21:40:38.088215 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088182 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy"
Apr 24 21:40:38.088215 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088187 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy"
Apr 24 21:40:38.088215 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088196 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent"
Apr 24 21:40:38.088215 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088201 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent"
Apr 24 21:40:38.088215 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088216 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container"
Apr 24 21:40:38.088420 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088221 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container"
Apr 24 21:40:38.088420 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088264 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kserve-container"
Apr 24 21:40:38.088420 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088271 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="agent"
Apr 24 21:40:38.088420 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.088279 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93d83c36-8a36-48bf-aa95-76e25d3071a2" containerName="kube-rbac-proxy"
Apr 24 21:40:38.091274 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.091258 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.094163 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.094134 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\""
Apr 24 21:40:38.094278 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.094194 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\""
Apr 24 21:40:38.101775 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.101724 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"]
Apr 24 21:40:38.143702 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.143675 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc"]
Apr 24 21:40:38.143958 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.143935 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerName="kserve-container" containerID="cri-o://e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb" gracePeriod=30
Apr 24 21:40:38.144041 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.143969 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerName="kube-rbac-proxy" containerID="cri-o://dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da" gracePeriod=30
Apr 24 21:40:38.191127 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.191087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70d29832-055e-49bd-8770-e6e15e529545-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.191263 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.191159 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.191263 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.191205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv9wn\" (UniqueName: \"kubernetes.io/projected/70d29832-055e-49bd-8770-e6e15e529545-kube-api-access-qv9wn\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.191263 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.191231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70d29832-055e-49bd-8770-e6e15e529545-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.291827 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.291793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.292001 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.291848 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qv9wn\" (UniqueName: \"kubernetes.io/projected/70d29832-055e-49bd-8770-e6e15e529545-kube-api-access-qv9wn\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.292001 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.291871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70d29832-055e-49bd-8770-e6e15e529545-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.292001 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.291937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70d29832-055e-49bd-8770-e6e15e529545-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.292001 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:40:38.291949 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found
Apr 24 21:40:38.292205 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:40:38.292012 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls podName:70d29832-055e-49bd-8770-e6e15e529545 nodeName:}" failed. No retries permitted until 2026-04-24 21:40:38.791995556 +0000 UTC m=+839.640039649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-l5wbc" (UID: "70d29832-055e-49bd-8770-e6e15e529545") : secret "isvc-lightgbm-predictor-serving-cert" not found
Apr 24 21:40:38.292428 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.292404 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70d29832-055e-49bd-8770-e6e15e529545-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.292667 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.292648 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70d29832-055e-49bd-8770-e6e15e529545-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.301535 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.301513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv9wn\" (UniqueName: \"kubernetes.io/projected/70d29832-055e-49bd-8770-e6e15e529545-kube-api-access-qv9wn\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.380891 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.380869 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc"
Apr 24 21:40:38.392413 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.392392 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xznsf\" (UniqueName: \"kubernetes.io/projected/5dceaf91-d766-45d9-a809-da6227a9a1b3-kube-api-access-xznsf\") pod \"5dceaf91-d766-45d9-a809-da6227a9a1b3\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") "
Apr 24 21:40:38.392506 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.392433 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dceaf91-d766-45d9-a809-da6227a9a1b3-message-dumper-kube-rbac-proxy-sar-config\") pod \"5dceaf91-d766-45d9-a809-da6227a9a1b3\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") "
Apr 24 21:40:38.392506 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.392487 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls\") pod \"5dceaf91-d766-45d9-a809-da6227a9a1b3\" (UID: \"5dceaf91-d766-45d9-a809-da6227a9a1b3\") "
Apr 24 21:40:38.392804 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.392782 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dceaf91-d766-45d9-a809-da6227a9a1b3-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "5dceaf91-d766-45d9-a809-da6227a9a1b3" (UID: "5dceaf91-d766-45d9-a809-da6227a9a1b3"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:40:38.394523 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.394503 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5dceaf91-d766-45d9-a809-da6227a9a1b3" (UID: "5dceaf91-d766-45d9-a809-da6227a9a1b3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:40:38.394583 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.394560 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dceaf91-d766-45d9-a809-da6227a9a1b3-kube-api-access-xznsf" (OuterVolumeSpecName: "kube-api-access-xznsf") pod "5dceaf91-d766-45d9-a809-da6227a9a1b3" (UID: "5dceaf91-d766-45d9-a809-da6227a9a1b3"). InnerVolumeSpecName "kube-api-access-xznsf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:40:38.493351 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.493322 2571 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dceaf91-d766-45d9-a809-da6227a9a1b3-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:40:38.493351 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.493350 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dceaf91-d766-45d9-a809-da6227a9a1b3-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:40:38.493538 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.493362 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xznsf\" (UniqueName: \"kubernetes.io/projected/5dceaf91-d766-45d9-a809-da6227a9a1b3-kube-api-access-xznsf\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:40:38.796451 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.796423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:38.798869 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:38.798852 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-l5wbc\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"
Apr 24 21:40:39.001733 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.001699 2571 util.go:30] "No sandbox for pod can be
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" Apr 24 21:40:39.047686 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.047610 2571 generic.go:358] "Generic (PLEG): container finished" podID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerID="dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da" exitCode=2 Apr 24 21:40:39.047686 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.047644 2571 generic.go:358] "Generic (PLEG): container finished" podID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerID="e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb" exitCode=2 Apr 24 21:40:39.047873 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.047693 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" Apr 24 21:40:39.047873 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.047692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" event={"ID":"5dceaf91-d766-45d9-a809-da6227a9a1b3","Type":"ContainerDied","Data":"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da"} Apr 24 21:40:39.047873 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.047733 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" event={"ID":"5dceaf91-d766-45d9-a809-da6227a9a1b3","Type":"ContainerDied","Data":"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb"} Apr 24 21:40:39.047873 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.047749 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc" event={"ID":"5dceaf91-d766-45d9-a809-da6227a9a1b3","Type":"ContainerDied","Data":"e7e14dce7654669b307b6c974d8f37532bb3647b24ada88c39e881730debc60d"} Apr 24 21:40:39.047873 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.047769 
2571 scope.go:117] "RemoveContainer" containerID="dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da" Apr 24 21:40:39.053158 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.052681 2571 generic.go:358] "Generic (PLEG): container finished" podID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerID="5b77add3688cbf13a83d9696874341828e54844c3045bdadb2d323513babcccd" exitCode=2 Apr 24 21:40:39.053158 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.052736 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerDied","Data":"5b77add3688cbf13a83d9696874341828e54844c3045bdadb2d323513babcccd"} Apr 24 21:40:39.057850 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.057834 2571 scope.go:117] "RemoveContainer" containerID="e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb" Apr 24 21:40:39.067393 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.067372 2571 scope.go:117] "RemoveContainer" containerID="dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da" Apr 24 21:40:39.067726 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:40:39.067699 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da\": container with ID starting with dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da not found: ID does not exist" containerID="dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da" Apr 24 21:40:39.067786 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.067741 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da"} err="failed to get container status \"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da\": rpc error: code = NotFound desc = 
could not find container \"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da\": container with ID starting with dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da not found: ID does not exist" Apr 24 21:40:39.067786 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.067771 2571 scope.go:117] "RemoveContainer" containerID="e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb" Apr 24 21:40:39.068053 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:40:39.068025 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb\": container with ID starting with e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb not found: ID does not exist" containerID="e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb" Apr 24 21:40:39.068116 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.068064 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb"} err="failed to get container status \"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb\": rpc error: code = NotFound desc = could not find container \"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb\": container with ID starting with e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb not found: ID does not exist" Apr 24 21:40:39.068116 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.068087 2571 scope.go:117] "RemoveContainer" containerID="dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da" Apr 24 21:40:39.068500 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.068463 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da"} err="failed to get container status 
\"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da\": rpc error: code = NotFound desc = could not find container \"dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da\": container with ID starting with dc6cabc537ac1c73299040b96cad261e422a4cf4ecaf19b62c6562fa1c4287da not found: ID does not exist" Apr 24 21:40:39.068561 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.068502 2571 scope.go:117] "RemoveContainer" containerID="e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb" Apr 24 21:40:39.068747 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.068729 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb"} err="failed to get container status \"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb\": rpc error: code = NotFound desc = could not find container \"e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb\": container with ID starting with e03d501df1d21777bdc08f94adf172c5c1be932e875984baf393c6bb396ec5cb not found: ID does not exist" Apr 24 21:40:39.083407 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.083384 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc"] Apr 24 21:40:39.091313 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.091268 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-wnhtc"] Apr 24 21:40:39.134833 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.134697 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"] Apr 24 21:40:39.137511 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:40:39.137481 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70d29832_055e_49bd_8770_e6e15e529545.slice/crio-17abaa14a9c66c3b7388527e08f4e0e9d29c16936a39e3f2de85587df6c7a2ba WatchSource:0}: Error finding container 17abaa14a9c66c3b7388527e08f4e0e9d29c16936a39e3f2de85587df6c7a2ba: Status 404 returned error can't find the container with id 17abaa14a9c66c3b7388527e08f4e0e9d29c16936a39e3f2de85587df6c7a2ba Apr 24 21:40:39.650094 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.650053 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" path="/var/lib/kubelet/pods/5dceaf91-d766-45d9-a809-da6227a9a1b3/volumes" Apr 24 21:40:39.818171 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:39.818133 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:40:40.057200 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:40.057160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" event={"ID":"70d29832-055e-49bd-8770-e6e15e529545","Type":"ContainerStarted","Data":"6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f"} Apr 24 21:40:40.057200 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:40.057205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" event={"ID":"70d29832-055e-49bd-8770-e6e15e529545","Type":"ContainerStarted","Data":"17abaa14a9c66c3b7388527e08f4e0e9d29c16936a39e3f2de85587df6c7a2ba"} Apr 24 21:40:43.066755 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:43.066719 2571 generic.go:358] "Generic (PLEG): container finished" podID="f5c641c0-e025-421b-a1ac-cba671e2b033" 
containerID="7d725d6b4ccf75671e260c95aba06ec014a45351cd5a91b305f140981badc949" exitCode=0 Apr 24 21:40:43.067152 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:43.066785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerDied","Data":"7d725d6b4ccf75671e260c95aba06ec014a45351cd5a91b305f140981badc949"} Apr 24 21:40:43.068083 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:43.068064 2571 generic.go:358] "Generic (PLEG): container finished" podID="70d29832-055e-49bd-8770-e6e15e529545" containerID="6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f" exitCode=0 Apr 24 21:40:43.068206 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:43.068110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" event={"ID":"70d29832-055e-49bd-8770-e6e15e529545","Type":"ContainerDied","Data":"6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f"} Apr 24 21:40:44.818594 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:44.818554 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:40:44.823431 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:44.823326 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:40:44.823595 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:44.823482 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:49.817759 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:49.817722 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:40:49.818161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:49.817842 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:40:50.091390 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:50.091287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" event={"ID":"70d29832-055e-49bd-8770-e6e15e529545","Type":"ContainerStarted","Data":"9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9"} Apr 24 21:40:50.091390 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:50.091347 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" event={"ID":"70d29832-055e-49bd-8770-e6e15e529545","Type":"ContainerStarted","Data":"c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614"} Apr 24 21:40:50.091584 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:50.091566 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" Apr 24 21:40:50.110544 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:50.110490 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podStartSLOduration=5.949398211 
podStartE2EDuration="12.110473509s" podCreationTimestamp="2026-04-24 21:40:38 +0000 UTC" firstStartedPulling="2026-04-24 21:40:43.069409287 +0000 UTC m=+843.917453381" lastFinishedPulling="2026-04-24 21:40:49.230484581 +0000 UTC m=+850.078528679" observedRunningTime="2026-04-24 21:40:50.109895195 +0000 UTC m=+850.957939312" watchObservedRunningTime="2026-04-24 21:40:50.110473509 +0000 UTC m=+850.958517626" Apr 24 21:40:51.094823 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:51.094794 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" Apr 24 21:40:51.096077 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:51.096054 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:40:52.097744 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:52.097700 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:40:54.817884 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:54.817846 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:40:54.823328 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:54.823271 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" 
podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:40:54.823629 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:54.823604 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:57.102434 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:57.102405 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" Apr 24 21:40:57.102973 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:57.102950 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:40:59.818165 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:40:59.818125 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:41:04.817896 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:04.817852 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:41:04.823240 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:04.823200 2571 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:41:04.823406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:04.823377 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:41:04.823692 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:04.823661 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:04.823805 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:04.823792 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:41:07.103924 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:07.103879 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:41:08.143391 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.143345 2571 generic.go:358] "Generic (PLEG): container finished" podID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerID="b65b4dbf58df61a3a73ea4465105435965ff7980d0af222af025184c4c6d2400" exitCode=0 Apr 24 21:41:08.143713 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.143399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" 
event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerDied","Data":"b65b4dbf58df61a3a73ea4465105435965ff7980d0af222af025184c4c6d2400"} Apr 24 21:41:08.165113 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.165087 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:41:08.232237 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.232200 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5c641c0-e025-421b-a1ac-cba671e2b033-kserve-provision-location\") pod \"f5c641c0-e025-421b-a1ac-cba671e2b033\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " Apr 24 21:41:08.232403 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.232261 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f5c641c0-e025-421b-a1ac-cba671e2b033-isvc-logger-kube-rbac-proxy-sar-config\") pod \"f5c641c0-e025-421b-a1ac-cba671e2b033\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " Apr 24 21:41:08.232403 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.232340 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45hjs\" (UniqueName: \"kubernetes.io/projected/f5c641c0-e025-421b-a1ac-cba671e2b033-kube-api-access-45hjs\") pod \"f5c641c0-e025-421b-a1ac-cba671e2b033\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " Apr 24 21:41:08.232403 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.232384 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c641c0-e025-421b-a1ac-cba671e2b033-proxy-tls\") pod \"f5c641c0-e025-421b-a1ac-cba671e2b033\" (UID: \"f5c641c0-e025-421b-a1ac-cba671e2b033\") " Apr 24 21:41:08.232582 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:41:08.232559 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c641c0-e025-421b-a1ac-cba671e2b033-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f5c641c0-e025-421b-a1ac-cba671e2b033" (UID: "f5c641c0-e025-421b-a1ac-cba671e2b033"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:41:08.232672 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.232656 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5c641c0-e025-421b-a1ac-cba671e2b033-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:41:08.232721 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.232670 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c641c0-e025-421b-a1ac-cba671e2b033-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "f5c641c0-e025-421b-a1ac-cba671e2b033" (UID: "f5c641c0-e025-421b-a1ac-cba671e2b033"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:41:08.234645 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.234624 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c641c0-e025-421b-a1ac-cba671e2b033-kube-api-access-45hjs" (OuterVolumeSpecName: "kube-api-access-45hjs") pod "f5c641c0-e025-421b-a1ac-cba671e2b033" (UID: "f5c641c0-e025-421b-a1ac-cba671e2b033"). InnerVolumeSpecName "kube-api-access-45hjs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:41:08.234773 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.234754 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c641c0-e025-421b-a1ac-cba671e2b033-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f5c641c0-e025-421b-a1ac-cba671e2b033" (UID: "f5c641c0-e025-421b-a1ac-cba671e2b033"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:41:08.334031 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.333923 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-45hjs\" (UniqueName: \"kubernetes.io/projected/f5c641c0-e025-421b-a1ac-cba671e2b033-kube-api-access-45hjs\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:41:08.334031 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.333975 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c641c0-e025-421b-a1ac-cba671e2b033-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:41:08.334031 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:08.333988 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f5c641c0-e025-421b-a1ac-cba671e2b033-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:41:09.147708 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.147675 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" event={"ID":"f5c641c0-e025-421b-a1ac-cba671e2b033","Type":"ContainerDied","Data":"41db4b9587b17d2ed12d49d8ac0078e252e4f93e1df2287ffb6fe28431e09866"} Apr 24 21:41:09.148134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.147721 2571 scope.go:117] "RemoveContainer" 
containerID="b65b4dbf58df61a3a73ea4465105435965ff7980d0af222af025184c4c6d2400" Apr 24 21:41:09.148134 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.147748 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb" Apr 24 21:41:09.156036 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.156019 2571 scope.go:117] "RemoveContainer" containerID="5b77add3688cbf13a83d9696874341828e54844c3045bdadb2d323513babcccd" Apr 24 21:41:09.163065 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.163041 2571 scope.go:117] "RemoveContainer" containerID="7d725d6b4ccf75671e260c95aba06ec014a45351cd5a91b305f140981badc949" Apr 24 21:41:09.169664 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.169639 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"] Apr 24 21:41:09.170320 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.170283 2571 scope.go:117] "RemoveContainer" containerID="59a3fe931962e86062d7a8c7d02748f9679ca42e791fcc9228e8e31d19938c59" Apr 24 21:41:09.172555 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.172534 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-d94d7847-vgjkb"] Apr 24 21:41:09.647804 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:09.647773 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" path="/var/lib/kubelet/pods/f5c641c0-e025-421b-a1ac-cba671e2b033/volumes" Apr 24 21:41:17.103374 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:17.103330 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:41:27.103631 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:27.103591 
2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:41:37.103579 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:37.103539 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:41:47.103848 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:47.103806 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:41:57.103246 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:41:57.103207 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:42:07.104030 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:07.104001 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" Apr 24 21:42:08.202222 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.202184 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"] Apr 24 21:42:08.202626 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.202504 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container" containerID="cri-o://c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614" gracePeriod=30 Apr 24 21:42:08.202626 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.202550 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kube-rbac-proxy" containerID="cri-o://9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9" gracePeriod=30 Apr 24 21:42:08.317816 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.317783 2571 generic.go:358] "Generic (PLEG): container finished" podID="70d29832-055e-49bd-8770-e6e15e529545" containerID="9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9" exitCode=2 Apr 24 21:42:08.317962 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.317831 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" event={"ID":"70d29832-055e-49bd-8770-e6e15e529545","Type":"ContainerDied","Data":"9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9"} Apr 24 21:42:08.348039 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348009 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"] Apr 24 21:42:08.348337 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348325 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerName="kube-rbac-proxy" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348338 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerName="kube-rbac-proxy" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348348 
2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerName="kserve-container" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348353 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerName="kserve-container" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348362 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348367 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348375 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348380 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348388 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="storage-initializer" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348392 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="storage-initializer" Apr 24 21:42:08.348394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348398 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" Apr 24 21:42:08.348711 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:42:08.348403 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" Apr 24 21:42:08.348711 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348448 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kserve-container" Apr 24 21:42:08.348711 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348456 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerName="kube-rbac-proxy" Apr 24 21:42:08.348711 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348462 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dceaf91-d766-45d9-a809-da6227a9a1b3" containerName="kserve-container" Apr 24 21:42:08.348711 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348470 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="kube-rbac-proxy" Apr 24 21:42:08.348711 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.348477 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5c641c0-e025-421b-a1ac-cba671e2b033" containerName="agent" Apr 24 21:42:08.351399 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.351383 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.353994 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.353970 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 24 21:42:08.354374 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.354358 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:42:08.365785 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.365762 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"] Apr 24 21:42:08.383884 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.383858 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxdrf\" (UniqueName: \"kubernetes.io/projected/70dd3200-b3e7-45df-9b3d-5d7089528784-kube-api-access-jxdrf\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.383986 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.383921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.383986 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.383944 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/70dd3200-b3e7-45df-9b3d-5d7089528784-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.383986 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.383963 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70dd3200-b3e7-45df-9b3d-5d7089528784-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.484833 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.484751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxdrf\" (UniqueName: \"kubernetes.io/projected/70dd3200-b3e7-45df-9b3d-5d7089528784-kube-api-access-jxdrf\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.484833 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.484796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.484833 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.484812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/70dd3200-b3e7-45df-9b3d-5d7089528784-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.485078 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.484841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70dd3200-b3e7-45df-9b3d-5d7089528784-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.485078 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:42:08.484908 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-serving-cert: secret "isvc-lightgbm-runtime-predictor-serving-cert" not found Apr 24 21:42:08.485078 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:42:08.484973 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls podName:70dd3200-b3e7-45df-9b3d-5d7089528784 nodeName:}" failed. No retries permitted until 2026-04-24 21:42:08.984953296 +0000 UTC m=+929.832997392 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls") pod "isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" (UID: "70dd3200-b3e7-45df-9b3d-5d7089528784") : secret "isvc-lightgbm-runtime-predictor-serving-cert" not found Apr 24 21:42:08.485282 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.485261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70dd3200-b3e7-45df-9b3d-5d7089528784-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.485545 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.485529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70dd3200-b3e7-45df-9b3d-5d7089528784-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.496507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.496478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxdrf\" (UniqueName: \"kubernetes.io/projected/70dd3200-b3e7-45df-9b3d-5d7089528784-kube-api-access-jxdrf\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.989066 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.989033 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:08.991584 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:08.991562 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:09.261445 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:09.261408 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:09.383046 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:09.383015 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"] Apr 24 21:42:09.387398 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:42:09.387359 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70dd3200_b3e7_45df_9b3d_5d7089528784.slice/crio-1ea21113482a1f9dfafcd76cf9824c77f6cf0c05caf564eda13d6ce50b631e3e WatchSource:0}: Error finding container 1ea21113482a1f9dfafcd76cf9824c77f6cf0c05caf564eda13d6ce50b631e3e: Status 404 returned error can't find the container with id 1ea21113482a1f9dfafcd76cf9824c77f6cf0c05caf564eda13d6ce50b631e3e Apr 24 21:42:10.325705 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:10.325674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" 
event={"ID":"70dd3200-b3e7-45df-9b3d-5d7089528784","Type":"ContainerStarted","Data":"8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766"} Apr 24 21:42:10.325705 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:10.325709 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" event={"ID":"70dd3200-b3e7-45df-9b3d-5d7089528784","Type":"ContainerStarted","Data":"1ea21113482a1f9dfafcd76cf9824c77f6cf0c05caf564eda13d6ce50b631e3e"} Apr 24 21:42:12.098349 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:12.098285 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 24 21:42:13.333664 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.333642 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" Apr 24 21:42:13.335350 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.335323 2571 generic.go:358] "Generic (PLEG): container finished" podID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerID="8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766" exitCode=0 Apr 24 21:42:13.335441 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.335330 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" event={"ID":"70dd3200-b3e7-45df-9b3d-5d7089528784","Type":"ContainerDied","Data":"8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766"} Apr 24 21:42:13.337200 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.337179 2571 generic.go:358] "Generic (PLEG): container finished" podID="70d29832-055e-49bd-8770-e6e15e529545" containerID="c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614" exitCode=0 Apr 24 21:42:13.337323 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.337226 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" event={"ID":"70d29832-055e-49bd-8770-e6e15e529545","Type":"ContainerDied","Data":"c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614"} Apr 24 21:42:13.337323 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.337251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" event={"ID":"70d29832-055e-49bd-8770-e6e15e529545","Type":"ContainerDied","Data":"17abaa14a9c66c3b7388527e08f4e0e9d29c16936a39e3f2de85587df6c7a2ba"} Apr 24 21:42:13.337323 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.337260 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc" Apr 24 21:42:13.337460 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.337268 2571 scope.go:117] "RemoveContainer" containerID="9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9" Apr 24 21:42:13.344790 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.344776 2571 scope.go:117] "RemoveContainer" containerID="c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614" Apr 24 21:42:13.352123 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.352093 2571 scope.go:117] "RemoveContainer" containerID="6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f" Apr 24 21:42:13.364329 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.364289 2571 scope.go:117] "RemoveContainer" containerID="9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9" Apr 24 21:42:13.364695 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:42:13.364674 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9\": container with ID starting with 9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9 not found: ID does not exist" containerID="9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9" Apr 24 21:42:13.364798 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.364702 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9"} err="failed to get container status \"9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9\": rpc error: code = NotFound desc = could not find container \"9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9\": container with ID starting with 9b986c669cb22d4a57d4a49274de8c5da689ac9be81674f396b37436d48b7ae9 not found: ID does not exist" Apr 24 21:42:13.364798 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.364720 2571 scope.go:117] "RemoveContainer" containerID="c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614" Apr 24 21:42:13.365012 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:42:13.364994 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614\": container with ID starting with c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614 not found: ID does not exist" containerID="c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614" Apr 24 21:42:13.365072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.365015 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614"} err="failed to get container status \"c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614\": rpc error: code = NotFound desc = could not find container \"c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614\": container with ID starting with c900d6411e501847314c865f77a30c04ce2fac43d46a7c94e133cfb3c5957614 not found: ID does not exist" Apr 24 21:42:13.365072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.365029 2571 scope.go:117] "RemoveContainer" containerID="6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f" Apr 24 21:42:13.365279 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:42:13.365260 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f\": container with ID starting with 6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f not found: ID does not exist" containerID="6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f" Apr 24 21:42:13.365370 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:42:13.365284 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f"} err="failed to get container status \"6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f\": rpc error: code = NotFound desc = could not find container \"6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f\": container with ID starting with 6d2a3d32f3e2e7f9feae5e2af74e7266b7d250edec51a6989e04e9d15cb8377f not found: ID does not exist" Apr 24 21:42:13.425796 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.425772 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv9wn\" (UniqueName: \"kubernetes.io/projected/70d29832-055e-49bd-8770-e6e15e529545-kube-api-access-qv9wn\") pod \"70d29832-055e-49bd-8770-e6e15e529545\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " Apr 24 21:42:13.425925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.425817 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70d29832-055e-49bd-8770-e6e15e529545-kserve-provision-location\") pod \"70d29832-055e-49bd-8770-e6e15e529545\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " Apr 24 21:42:13.425925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.425857 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls\") pod \"70d29832-055e-49bd-8770-e6e15e529545\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " Apr 24 21:42:13.425925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.425890 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/70d29832-055e-49bd-8770-e6e15e529545-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"70d29832-055e-49bd-8770-e6e15e529545\" (UID: \"70d29832-055e-49bd-8770-e6e15e529545\") " Apr 24 21:42:13.426689 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.426212 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d29832-055e-49bd-8770-e6e15e529545-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "70d29832-055e-49bd-8770-e6e15e529545" (UID: "70d29832-055e-49bd-8770-e6e15e529545"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:13.426689 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.426275 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d29832-055e-49bd-8770-e6e15e529545-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "70d29832-055e-49bd-8770-e6e15e529545" (UID: "70d29832-055e-49bd-8770-e6e15e529545"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:42:13.428082 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.428059 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d29832-055e-49bd-8770-e6e15e529545-kube-api-access-qv9wn" (OuterVolumeSpecName: "kube-api-access-qv9wn") pod "70d29832-055e-49bd-8770-e6e15e529545" (UID: "70d29832-055e-49bd-8770-e6e15e529545"). InnerVolumeSpecName "kube-api-access-qv9wn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:42:13.428161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.428138 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "70d29832-055e-49bd-8770-e6e15e529545" (UID: "70d29832-055e-49bd-8770-e6e15e529545"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:42:13.527370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.527251 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qv9wn\" (UniqueName: \"kubernetes.io/projected/70d29832-055e-49bd-8770-e6e15e529545-kube-api-access-qv9wn\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:42:13.527370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.527283 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70d29832-055e-49bd-8770-e6e15e529545-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:42:13.527370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.527307 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70d29832-055e-49bd-8770-e6e15e529545-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:42:13.527370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.527318 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70d29832-055e-49bd-8770-e6e15e529545-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:42:13.659029 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.659000 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"] Apr 24 21:42:13.664653 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:13.664629 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-l5wbc"] Apr 24 21:42:14.342359 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:14.342262 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" event={"ID":"70dd3200-b3e7-45df-9b3d-5d7089528784","Type":"ContainerStarted","Data":"75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9"} Apr 24 21:42:14.342359 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:14.342314 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" event={"ID":"70dd3200-b3e7-45df-9b3d-5d7089528784","Type":"ContainerStarted","Data":"58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1"} Apr 24 21:42:14.342788 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:14.342604 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:14.342788 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:14.342745 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" Apr 24 21:42:14.343990 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:14.343958 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:42:14.363160 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:14.363115 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podStartSLOduration=6.363104237 podStartE2EDuration="6.363104237s" podCreationTimestamp="2026-04-24 21:42:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:42:14.36093932 +0000 UTC m=+935.208983459" watchObservedRunningTime="2026-04-24 21:42:14.363104237 +0000 UTC m=+935.211148353"
Apr 24 21:42:15.344805 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:15.344767 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:42:15.648326 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:15.648213 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d29832-055e-49bd-8770-e6e15e529545" path="/var/lib/kubelet/pods/70d29832-055e-49bd-8770-e6e15e529545/volumes"
Apr 24 21:42:20.349507 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:20.349480 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"
Apr 24 21:42:20.350044 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:20.350019 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:42:30.350867 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:30.350821 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:42:40.350635 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:40.350593 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:42:50.350613 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:42:50.350566 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:43:00.350367 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:00.350266 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:43:10.350836 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:10.350792 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:43:20.350738 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:20.350690 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:43:30.351003 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:30.350973 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"
Apr 24 21:43:38.591971 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.591936 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"]
Apr 24 21:43:38.592438 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.592367 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" containerID="cri-o://58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1" gracePeriod=30
Apr 24 21:43:38.592438 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.592419 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kube-rbac-proxy" containerID="cri-o://75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9" gracePeriod=30
Apr 24 21:43:38.704992 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.704961 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"]
Apr 24 21:43:38.705261 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.705249 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container"
Apr 24 21:43:38.705331 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.705264 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container"
Apr 24 21:43:38.705331 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.705275 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kube-rbac-proxy"
Apr 24 21:43:38.705331 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.705280 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kube-rbac-proxy"
Apr 24 21:43:38.705331 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.705316 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="storage-initializer"
Apr 24 21:43:38.705331 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.705325 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="storage-initializer"
Apr 24 21:43:38.705571 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.705392 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kube-rbac-proxy"
Apr 24 21:43:38.705571 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.705402 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="70d29832-055e-49bd-8770-e6e15e529545" containerName="kserve-container"
Apr 24 21:43:38.708403 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.708387 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.710763 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.710743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 24 21:43:38.710872 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.710835 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\""
Apr 24 21:43:38.717150 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.717128 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"]
Apr 24 21:43:38.768475 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.768424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw77\" (UniqueName: \"kubernetes.io/projected/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kube-api-access-2gw77\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.773020 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.772976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e4cd5c2-7be7-4601-ae00-9cc966127b19-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.773188 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.773087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.773332 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.773281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e4cd5c2-7be7-4601-ae00-9cc966127b19-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.874622 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.874541 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e4cd5c2-7be7-4601-ae00-9cc966127b19-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.874622 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.874601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw77\" (UniqueName: \"kubernetes.io/projected/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kube-api-access-2gw77\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.874804 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.874634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e4cd5c2-7be7-4601-ae00-9cc966127b19-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.874804 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.874653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.875130 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.875101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.875422 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.875403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e4cd5c2-7be7-4601-ae00-9cc966127b19-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.877161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.877133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e4cd5c2-7be7-4601-ae00-9cc966127b19-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:38.883321 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:38.883277 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw77\" (UniqueName: \"kubernetes.io/projected/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kube-api-access-2gw77\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:39.018094 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:39.018062 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:43:39.151468 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:39.151257 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"]
Apr 24 21:43:39.154175 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:43:39.154144 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e4cd5c2_7be7_4601_ae00_9cc966127b19.slice/crio-ed06ee8d4d539daf20249a4cd27fc67821f2bf79f38655c9b960027be698da28 WatchSource:0}: Error finding container ed06ee8d4d539daf20249a4cd27fc67821f2bf79f38655c9b960027be698da28: Status 404 returned error can't find the container with id ed06ee8d4d539daf20249a4cd27fc67821f2bf79f38655c9b960027be698da28
Apr 24 21:43:39.575120 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:39.575083 2571 generic.go:358] "Generic (PLEG): container finished" podID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerID="75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9" exitCode=2
Apr 24 21:43:39.575311 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:39.575166 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" event={"ID":"70dd3200-b3e7-45df-9b3d-5d7089528784","Type":"ContainerDied","Data":"75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9"}
Apr 24 21:43:39.576465 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:39.576440 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" event={"ID":"3e4cd5c2-7be7-4601-ae00-9cc966127b19","Type":"ContainerStarted","Data":"bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1"}
Apr 24 21:43:39.576568 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:39.576473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" event={"ID":"3e4cd5c2-7be7-4601-ae00-9cc966127b19","Type":"ContainerStarted","Data":"ed06ee8d4d539daf20249a4cd27fc67821f2bf79f38655c9b960027be698da28"}
Apr 24 21:43:40.345222 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:40.345183 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused"
Apr 24 21:43:40.350544 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:40.350510 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:43:43.588205 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:43.588169 2571 generic.go:358] "Generic (PLEG): container finished" podID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerID="bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1" exitCode=0
Apr 24 21:43:43.588577 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:43.588239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" event={"ID":"3e4cd5c2-7be7-4601-ae00-9cc966127b19","Type":"ContainerDied","Data":"bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1"}
Apr 24 21:43:44.051705 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.051679 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"
Apr 24 21:43:44.117463 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.117428 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70dd3200-b3e7-45df-9b3d-5d7089528784-kserve-provision-location\") pod \"70dd3200-b3e7-45df-9b3d-5d7089528784\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") "
Apr 24 21:43:44.117663 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.117517 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxdrf\" (UniqueName: \"kubernetes.io/projected/70dd3200-b3e7-45df-9b3d-5d7089528784-kube-api-access-jxdrf\") pod \"70dd3200-b3e7-45df-9b3d-5d7089528784\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") "
Apr 24 21:43:44.117663 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.117548 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70dd3200-b3e7-45df-9b3d-5d7089528784-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"70dd3200-b3e7-45df-9b3d-5d7089528784\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") "
Apr 24 21:43:44.117663 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.117589 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls\") pod \"70dd3200-b3e7-45df-9b3d-5d7089528784\" (UID: \"70dd3200-b3e7-45df-9b3d-5d7089528784\") "
Apr 24 21:43:44.117900 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.117857 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70dd3200-b3e7-45df-9b3d-5d7089528784-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "70dd3200-b3e7-45df-9b3d-5d7089528784" (UID: "70dd3200-b3e7-45df-9b3d-5d7089528784"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:43:44.118008 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.117976 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70dd3200-b3e7-45df-9b3d-5d7089528784-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "70dd3200-b3e7-45df-9b3d-5d7089528784" (UID: "70dd3200-b3e7-45df-9b3d-5d7089528784"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:43:44.123236 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.122802 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70dd3200-b3e7-45df-9b3d-5d7089528784-kube-api-access-jxdrf" (OuterVolumeSpecName: "kube-api-access-jxdrf") pod "70dd3200-b3e7-45df-9b3d-5d7089528784" (UID: "70dd3200-b3e7-45df-9b3d-5d7089528784"). InnerVolumeSpecName "kube-api-access-jxdrf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:43:44.124560 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.124497 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "70dd3200-b3e7-45df-9b3d-5d7089528784" (UID: "70dd3200-b3e7-45df-9b3d-5d7089528784"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:43:44.218527 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.218384 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70dd3200-b3e7-45df-9b3d-5d7089528784-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:43:44.218527 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.218420 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxdrf\" (UniqueName: \"kubernetes.io/projected/70dd3200-b3e7-45df-9b3d-5d7089528784-kube-api-access-jxdrf\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:43:44.218527 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.218439 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/70dd3200-b3e7-45df-9b3d-5d7089528784-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:43:44.218527 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.218455 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70dd3200-b3e7-45df-9b3d-5d7089528784-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:43:44.594830 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.593547 2571 generic.go:358] "Generic (PLEG): container finished" podID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerID="58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1" exitCode=0
Apr 24 21:43:44.594830 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.593594 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" event={"ID":"70dd3200-b3e7-45df-9b3d-5d7089528784","Type":"ContainerDied","Data":"58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1"}
Apr 24 21:43:44.594830 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.593624 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk" event={"ID":"70dd3200-b3e7-45df-9b3d-5d7089528784","Type":"ContainerDied","Data":"1ea21113482a1f9dfafcd76cf9824c77f6cf0c05caf564eda13d6ce50b631e3e"}
Apr 24 21:43:44.594830 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.593646 2571 scope.go:117] "RemoveContainer" containerID="75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9"
Apr 24 21:43:44.594830 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.593812 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"
Apr 24 21:43:44.610195 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.610172 2571 scope.go:117] "RemoveContainer" containerID="58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1"
Apr 24 21:43:44.624981 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.624820 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"]
Apr 24 21:43:44.625057 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.624993 2571 scope.go:117] "RemoveContainer" containerID="8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766"
Apr 24 21:43:44.628730 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.628689 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bbswk"]
Apr 24 21:43:44.637317 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.636695 2571 scope.go:117] "RemoveContainer" containerID="75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9"
Apr 24 21:43:44.637317 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:43:44.637231 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9\": container with ID starting with 75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9 not found: ID does not exist" containerID="75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9"
Apr 24 21:43:44.637317 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.637265 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9"} err="failed to get container status \"75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9\": rpc error: code = NotFound desc = could not find container \"75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9\": container with ID starting with 75412ab87a714ae6ed8136f1cb9a26e62bd7e31e5b699d78d20c5c97dccc5ac9 not found: ID does not exist"
Apr 24 21:43:44.637317 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.637310 2571 scope.go:117] "RemoveContainer" containerID="58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1"
Apr 24 21:43:44.638678 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:43:44.637594 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1\": container with ID starting with 58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1 not found: ID does not exist" containerID="58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1"
Apr 24 21:43:44.638678 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.637624 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1"} err="failed to get container status \"58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1\": rpc error: code = NotFound desc = could not find container \"58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1\": container with ID starting with 58bef2f37683435824e7838a64c28e4d005a8fb3e0bbf326cf6774c3b86032b1 not found: ID does not exist"
Apr 24 21:43:44.638678 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.637645 2571 scope.go:117] "RemoveContainer" containerID="8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766"
Apr 24 21:43:44.638678 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:43:44.638176 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766\": container with ID starting with 8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766 not found: ID does not exist" containerID="8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766"
Apr 24 21:43:44.638678 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:44.638205 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766"} err="failed to get container status \"8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766\": rpc error: code = NotFound desc = could not find container \"8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766\": container with ID starting with 8ceb670668803e46d3d37b2220a16a095e059583446b52b0d5c57823d6a56766 not found: ID does not exist"
Apr 24 21:43:45.651239 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:43:45.650795 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" path="/var/lib/kubelet/pods/70dd3200-b3e7-45df-9b3d-5d7089528784/volumes"
Apr 24 21:45:51.525630 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:45:51.525608 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:45:51.984948 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:45:51.984915 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" event={"ID":"3e4cd5c2-7be7-4601-ae00-9cc966127b19","Type":"ContainerStarted","Data":"b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d"}
Apr 24 21:45:51.984948 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:45:51.984952 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" event={"ID":"3e4cd5c2-7be7-4601-ae00-9cc966127b19","Type":"ContainerStarted","Data":"bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6"}
Apr 24 21:45:51.985170 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:45:51.985068 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:45:52.013764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:45:52.013717 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" podStartSLOduration=6.187739112 podStartE2EDuration="2m14.013703365s" podCreationTimestamp="2026-04-24 21:43:38 +0000 UTC" firstStartedPulling="2026-04-24 21:43:43.58931451 +0000 UTC m=+1024.437358604" lastFinishedPulling="2026-04-24 21:45:51.415278762 +0000 UTC m=+1152.263322857" observedRunningTime="2026-04-24 21:45:52.012997412 +0000 UTC m=+1152.861041529" watchObservedRunningTime="2026-04-24 21:45:52.013703365 +0000 UTC m=+1152.861747481"
Apr 24 21:45:52.988151 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:45:52.988118 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:45:58.997330 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:45:58.997231 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:46:29.001350 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:29.001321 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"
Apr 24 21:46:38.860744 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.860696 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"]
Apr 24 21:46:38.861390 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.861112 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kserve-container" containerID="cri-o://bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6" gracePeriod=30
Apr 24 21:46:38.861390 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.861162 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kube-rbac-proxy" containerID="cri-o://b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d" gracePeriod=30
Apr 24 21:46:38.968771 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.968738 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"]
Apr 24 21:46:38.969080 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.969067 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="storage-initializer"
Apr 24 21:46:38.969080 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.969081 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="storage-initializer"
Apr 24 21:46:38.969174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.969092 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kube-rbac-proxy"
Apr 24 21:46:38.969174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.969098 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kube-rbac-proxy"
Apr 24 21:46:38.969174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.969105 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container"
Apr 24 21:46:38.969174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.969110 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container"
Apr 24 21:46:38.969174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.969157 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kube-rbac-proxy"
Apr 24 21:46:38.969174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.969165 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="70dd3200-b3e7-45df-9b3d-5d7089528784" containerName="kserve-container"
Apr 24 21:46:38.972247 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.972224 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"
Apr 24 21:46:38.974605 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.974577 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\""
Apr 24 21:46:38.974732 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.974578 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 24 21:46:38.982673 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.982648 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"]
Apr 24 21:46:38.992424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:38.992395 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused"
Apr 24 21:46:39.059072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.059035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5mw\" (UniqueName: \"kubernetes.io/projected/87645a8d-7741-40a4-99b8-38ab9ecb1671-kube-api-access-jj5mw\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"
Apr 24 21:46:39.059072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.059079 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87645a8d-7741-40a4-99b8-38ab9ecb1671-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"
Apr 24 21:46:39.059351 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.059139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"
Apr 24 21:46:39.059351 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.059188 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87645a8d-7741-40a4-99b8-38ab9ecb1671-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"
Apr 24 21:46:39.119476 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.119394 2571 generic.go:358] "Generic (PLEG): container finished" podID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerID="b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d" exitCode=2
Apr 24 21:46:39.119476 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.119460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" event={"ID":"3e4cd5c2-7be7-4601-ae00-9cc966127b19","Type":"ContainerDied","Data":"b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d"}
Apr 24 21:46:39.160494 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.160456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jj5mw\" (UniqueName: \"kubernetes.io/projected/87645a8d-7741-40a4-99b8-38ab9ecb1671-kube-api-access-jj5mw\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"
Apr 24 21:46:39.160494 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.160498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87645a8d-7741-40a4-99b8-38ab9ecb1671-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"
Apr 24 21:46:39.160741 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.160519 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls\") pod 
\"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:39.160741 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.160557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87645a8d-7741-40a4-99b8-38ab9ecb1671-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:39.160741 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:46:39.160631 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-serving-cert: secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 24 21:46:39.160741 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:46:39.160705 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls podName:87645a8d-7741-40a4-99b8-38ab9ecb1671 nodeName:}" failed. No retries permitted until 2026-04-24 21:46:39.660685468 +0000 UTC m=+1200.508729562 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls") pod "isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" (UID: "87645a8d-7741-40a4-99b8-38ab9ecb1671") : secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 24 21:46:39.161043 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.161021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87645a8d-7741-40a4-99b8-38ab9ecb1671-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:39.161217 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.161201 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87645a8d-7741-40a4-99b8-38ab9ecb1671-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:39.169174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.169142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5mw\" (UniqueName: \"kubernetes.io/projected/87645a8d-7741-40a4-99b8-38ab9ecb1671-kube-api-access-jj5mw\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:39.665467 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.665433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:39.667903 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.667877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:39.885079 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.885036 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:39.910260 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.910237 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" Apr 24 21:46:39.968919 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.968888 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e4cd5c2-7be7-4601-ae00-9cc966127b19-proxy-tls\") pod \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " Apr 24 21:46:39.969072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.968942 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e4cd5c2-7be7-4601-ae00-9cc966127b19-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " Apr 24 21:46:39.969072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.968981 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gw77\" (UniqueName: \"kubernetes.io/projected/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kube-api-access-2gw77\") pod \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " Apr 24 21:46:39.969072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.969008 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kserve-provision-location\") pod \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\" (UID: \"3e4cd5c2-7be7-4601-ae00-9cc966127b19\") " Apr 24 21:46:39.969517 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.969483 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4cd5c2-7be7-4601-ae00-9cc966127b19-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "3e4cd5c2-7be7-4601-ae00-9cc966127b19" (UID: "3e4cd5c2-7be7-4601-ae00-9cc966127b19"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:46:39.969639 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.969534 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3e4cd5c2-7be7-4601-ae00-9cc966127b19" (UID: "3e4cd5c2-7be7-4601-ae00-9cc966127b19"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:39.973725 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.973698 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e4cd5c2-7be7-4601-ae00-9cc966127b19-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3e4cd5c2-7be7-4601-ae00-9cc966127b19" (UID: "3e4cd5c2-7be7-4601-ae00-9cc966127b19"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:39.974853 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:39.974828 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kube-api-access-2gw77" (OuterVolumeSpecName: "kube-api-access-2gw77") pod "3e4cd5c2-7be7-4601-ae00-9cc966127b19" (UID: "3e4cd5c2-7be7-4601-ae00-9cc966127b19"). InnerVolumeSpecName "kube-api-access-2gw77". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:46:40.012138 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.012108 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"] Apr 24 21:46:40.015200 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:46:40.015171 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87645a8d_7741_40a4_99b8_38ab9ecb1671.slice/crio-a8e372f0e6bd6ff5172881bb255908e07d253b2d74bd4c2b2f701c2d8beba771 WatchSource:0}: Error finding container a8e372f0e6bd6ff5172881bb255908e07d253b2d74bd4c2b2f701c2d8beba771: Status 404 returned error can't find the container with id a8e372f0e6bd6ff5172881bb255908e07d253b2d74bd4c2b2f701c2d8beba771 Apr 24 21:46:40.069599 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.069577 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e4cd5c2-7be7-4601-ae00-9cc966127b19-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:46:40.069686 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.069600 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e4cd5c2-7be7-4601-ae00-9cc966127b19-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:46:40.069686 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.069611 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2gw77\" (UniqueName: \"kubernetes.io/projected/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kube-api-access-2gw77\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:46:40.069686 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.069621 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/3e4cd5c2-7be7-4601-ae00-9cc966127b19-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:46:40.124871 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.124837 2571 generic.go:358] "Generic (PLEG): container finished" podID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerID="bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6" exitCode=0 Apr 24 21:46:40.125022 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.124919 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" event={"ID":"3e4cd5c2-7be7-4601-ae00-9cc966127b19","Type":"ContainerDied","Data":"bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6"} Apr 24 21:46:40.125022 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.124926 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" Apr 24 21:46:40.125022 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.124977 2571 scope.go:117] "RemoveContainer" containerID="b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d" Apr 24 21:46:40.125160 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.124964 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv" event={"ID":"3e4cd5c2-7be7-4601-ae00-9cc966127b19","Type":"ContainerDied","Data":"ed06ee8d4d539daf20249a4cd27fc67821f2bf79f38655c9b960027be698da28"} Apr 24 21:46:40.126440 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.126416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" event={"ID":"87645a8d-7741-40a4-99b8-38ab9ecb1671","Type":"ContainerStarted","Data":"5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2"} Apr 24 21:46:40.126523 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:46:40.126441 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" event={"ID":"87645a8d-7741-40a4-99b8-38ab9ecb1671","Type":"ContainerStarted","Data":"a8e372f0e6bd6ff5172881bb255908e07d253b2d74bd4c2b2f701c2d8beba771"} Apr 24 21:46:40.135546 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.135528 2571 scope.go:117] "RemoveContainer" containerID="bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6" Apr 24 21:46:40.143145 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.143130 2571 scope.go:117] "RemoveContainer" containerID="bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1" Apr 24 21:46:40.151250 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.151231 2571 scope.go:117] "RemoveContainer" containerID="b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d" Apr 24 21:46:40.151532 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:46:40.151513 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d\": container with ID starting with b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d not found: ID does not exist" containerID="b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d" Apr 24 21:46:40.151600 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.151540 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d"} err="failed to get container status \"b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d\": rpc error: code = NotFound desc = could not find container \"b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d\": container with ID starting with b6f660bd6be75aa91f2f82daea9ab0c579eb1f82a76f73727521a0583038123d not found: ID does not 
exist" Apr 24 21:46:40.151600 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.151561 2571 scope.go:117] "RemoveContainer" containerID="bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6" Apr 24 21:46:40.151824 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:46:40.151805 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6\": container with ID starting with bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6 not found: ID does not exist" containerID="bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6" Apr 24 21:46:40.151873 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.151831 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6"} err="failed to get container status \"bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6\": rpc error: code = NotFound desc = could not find container \"bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6\": container with ID starting with bcb5dfa3338f1e243f01b53f4671866411aef10595a64423260fb34e2449f3c6 not found: ID does not exist" Apr 24 21:46:40.151873 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.151849 2571 scope.go:117] "RemoveContainer" containerID="bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1" Apr 24 21:46:40.152075 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:46:40.152059 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1\": container with ID starting with bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1 not found: ID does not exist" containerID="bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1" Apr 24 21:46:40.152125 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.152078 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1"} err="failed to get container status \"bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1\": rpc error: code = NotFound desc = could not find container \"bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1\": container with ID starting with bb27436a5af1583c69d32b63cbb510e031d5d70c8c7935ef9a32146184a761c1 not found: ID does not exist" Apr 24 21:46:40.159731 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.159708 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"] Apr 24 21:46:40.162869 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:40.162845 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-cnmvv"] Apr 24 21:46:41.648163 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:41.648130 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" path="/var/lib/kubelet/pods/3e4cd5c2-7be7-4601-ae00-9cc966127b19/volumes" Apr 24 21:46:44.140557 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:44.140465 2571 generic.go:358] "Generic (PLEG): container finished" podID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerID="5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2" exitCode=0 Apr 24 21:46:44.140933 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:44.140548 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" event={"ID":"87645a8d-7741-40a4-99b8-38ab9ecb1671","Type":"ContainerDied","Data":"5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2"} Apr 24 21:46:45.145742 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:45.145705 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" event={"ID":"87645a8d-7741-40a4-99b8-38ab9ecb1671","Type":"ContainerStarted","Data":"91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303"} Apr 24 21:46:45.145742 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:45.145747 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" event={"ID":"87645a8d-7741-40a4-99b8-38ab9ecb1671","Type":"ContainerStarted","Data":"de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0"} Apr 24 21:46:45.146273 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:45.146028 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:45.146273 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:45.146157 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:45.147341 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:45.147318 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:46:45.165170 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:45.165121 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" podStartSLOduration=7.165104422 podStartE2EDuration="7.165104422s" podCreationTimestamp="2026-04-24 21:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:46:45.163816629 +0000 UTC 
m=+1206.011860744" watchObservedRunningTime="2026-04-24 21:46:45.165104422 +0000 UTC m=+1206.013148538" Apr 24 21:46:46.148976 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:46.148924 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:46:51.153865 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:51.153835 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:46:51.154483 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:46:51.154455 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:47:01.155463 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:01.155429 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:47:09.048787 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.048744 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"] Apr 24 21:47:09.049254 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.049180 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kserve-container" containerID="cri-o://de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0" gracePeriod=30 Apr 24 21:47:09.049355 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 21:47:09.049225 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kube-rbac-proxy" containerID="cri-o://91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303" gracePeriod=30 Apr 24 21:47:09.155698 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.155660 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4"] Apr 24 21:47:09.155982 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.155970 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kube-rbac-proxy" Apr 24 21:47:09.156025 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.155987 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kube-rbac-proxy" Apr 24 21:47:09.156025 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.156006 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kserve-container" Apr 24 21:47:09.156025 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.156011 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kserve-container" Apr 24 21:47:09.156117 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.156026 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="storage-initializer" Apr 24 21:47:09.156117 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.156031 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="storage-initializer" Apr 24 21:47:09.156117 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.156082 2571 
memory_manager.go:356] "RemoveStaleState removing state" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kserve-container" Apr 24 21:47:09.156117 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.156094 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e4cd5c2-7be7-4601-ae00-9cc966127b19" containerName="kube-rbac-proxy" Apr 24 21:47:09.159284 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.159259 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.161901 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.161877 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 24 21:47:09.162164 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.162148 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:47:09.170792 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.170767 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4"] Apr 24 21:47:09.209452 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.209400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5xc\" (UniqueName: \"kubernetes.io/projected/86edb432-6d7e-415f-9d15-3711f64b138b-kube-api-access-zh5xc\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.209650 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.209482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/86edb432-6d7e-415f-9d15-3711f64b138b-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.209650 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.209514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.209650 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.209536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86edb432-6d7e-415f-9d15-3711f64b138b-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.221956 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.221921 2571 generic.go:358] "Generic (PLEG): container finished" podID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerID="91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303" exitCode=2 Apr 24 21:47:09.222116 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.221985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" event={"ID":"87645a8d-7741-40a4-99b8-38ab9ecb1671","Type":"ContainerDied","Data":"91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303"} Apr 24 21:47:09.310627 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.310535 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86edb432-6d7e-415f-9d15-3711f64b138b-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.310627 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.310589 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.310627 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.310617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86edb432-6d7e-415f-9d15-3711f64b138b-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.310884 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.310679 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5xc\" (UniqueName: \"kubernetes.io/projected/86edb432-6d7e-415f-9d15-3711f64b138b-kube-api-access-zh5xc\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.310884 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:47:09.310786 2571 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 24 21:47:09.310884 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:47:09.310869 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls podName:86edb432-6d7e-415f-9d15-3711f64b138b nodeName:}" failed. No retries permitted until 2026-04-24 21:47:09.810846272 +0000 UTC m=+1230.658890366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" (UID: "86edb432-6d7e-415f-9d15-3711f64b138b") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 24 21:47:09.311030 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.311008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86edb432-6d7e-415f-9d15-3711f64b138b-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.311454 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.311428 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86edb432-6d7e-415f-9d15-3711f64b138b-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.321354 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.321319 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5xc\" 
(UniqueName: \"kubernetes.io/projected/86edb432-6d7e-415f-9d15-3711f64b138b-kube-api-access-zh5xc\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.815453 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.815393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.818393 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.818362 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:09.887011 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.886986 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:47:09.916925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.916883 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls\") pod \"87645a8d-7741-40a4-99b8-38ab9ecb1671\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " Apr 24 21:47:09.917101 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.916950 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87645a8d-7741-40a4-99b8-38ab9ecb1671-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"87645a8d-7741-40a4-99b8-38ab9ecb1671\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " Apr 24 21:47:09.917101 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.916984 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj5mw\" (UniqueName: \"kubernetes.io/projected/87645a8d-7741-40a4-99b8-38ab9ecb1671-kube-api-access-jj5mw\") pod \"87645a8d-7741-40a4-99b8-38ab9ecb1671\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " Apr 24 21:47:09.917101 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.917021 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87645a8d-7741-40a4-99b8-38ab9ecb1671-kserve-provision-location\") pod \"87645a8d-7741-40a4-99b8-38ab9ecb1671\" (UID: \"87645a8d-7741-40a4-99b8-38ab9ecb1671\") " Apr 24 21:47:09.917447 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.917419 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87645a8d-7741-40a4-99b8-38ab9ecb1671-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "87645a8d-7741-40a4-99b8-38ab9ecb1671" (UID: "87645a8d-7741-40a4-99b8-38ab9ecb1671"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:47:09.917572 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.917461 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87645a8d-7741-40a4-99b8-38ab9ecb1671-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87645a8d-7741-40a4-99b8-38ab9ecb1671" (UID: "87645a8d-7741-40a4-99b8-38ab9ecb1671"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:09.919265 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.919236 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87645a8d-7741-40a4-99b8-38ab9ecb1671-kube-api-access-jj5mw" (OuterVolumeSpecName: "kube-api-access-jj5mw") pod "87645a8d-7741-40a4-99b8-38ab9ecb1671" (UID: "87645a8d-7741-40a4-99b8-38ab9ecb1671"). InnerVolumeSpecName "kube-api-access-jj5mw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:47:09.919723 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:09.919706 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "87645a8d-7741-40a4-99b8-38ab9ecb1671" (UID: "87645a8d-7741-40a4-99b8-38ab9ecb1671"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:10.018275 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.018237 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87645a8d-7741-40a4-99b8-38ab9ecb1671-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:47:10.018275 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.018271 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87645a8d-7741-40a4-99b8-38ab9ecb1671-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:47:10.018497 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.018283 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jj5mw\" (UniqueName: \"kubernetes.io/projected/87645a8d-7741-40a4-99b8-38ab9ecb1671-kube-api-access-jj5mw\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:47:10.018497 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.018326 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87645a8d-7741-40a4-99b8-38ab9ecb1671-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:47:10.069518 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.069458 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:10.203044 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.203019 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4"] Apr 24 21:47:10.205356 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:47:10.205321 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86edb432_6d7e_415f_9d15_3711f64b138b.slice/crio-e54c4da138693737fafb144a2e8a46eeea6375ee1468cb41d0b6aa0bf64e452a WatchSource:0}: Error finding container e54c4da138693737fafb144a2e8a46eeea6375ee1468cb41d0b6aa0bf64e452a: Status 404 returned error can't find the container with id e54c4da138693737fafb144a2e8a46eeea6375ee1468cb41d0b6aa0bf64e452a Apr 24 21:47:10.226992 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.226959 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" event={"ID":"86edb432-6d7e-415f-9d15-3711f64b138b","Type":"ContainerStarted","Data":"e54c4da138693737fafb144a2e8a46eeea6375ee1468cb41d0b6aa0bf64e452a"} Apr 24 21:47:10.228759 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.228729 2571 generic.go:358] "Generic (PLEG): container finished" podID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerID="de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0" exitCode=0 Apr 24 21:47:10.228916 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.228766 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" event={"ID":"87645a8d-7741-40a4-99b8-38ab9ecb1671","Type":"ContainerDied","Data":"de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0"} Apr 24 21:47:10.228916 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.228799 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" event={"ID":"87645a8d-7741-40a4-99b8-38ab9ecb1671","Type":"ContainerDied","Data":"a8e372f0e6bd6ff5172881bb255908e07d253b2d74bd4c2b2f701c2d8beba771"} Apr 24 21:47:10.228916 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.228825 2571 scope.go:117] "RemoveContainer" containerID="91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303" Apr 24 21:47:10.228916 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.228828 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w" Apr 24 21:47:10.240328 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.240287 2571 scope.go:117] "RemoveContainer" containerID="de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0" Apr 24 21:47:10.248900 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.248873 2571 scope.go:117] "RemoveContainer" containerID="5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2" Apr 24 21:47:10.251165 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.251138 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"] Apr 24 21:47:10.255063 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.255036 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j269w"] Apr 24 21:47:10.257890 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.257873 2571 scope.go:117] "RemoveContainer" containerID="91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303" Apr 24 21:47:10.258155 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:47:10.258139 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303\": container with ID starting with 
91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303 not found: ID does not exist" containerID="91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303" Apr 24 21:47:10.258204 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.258165 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303"} err="failed to get container status \"91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303\": rpc error: code = NotFound desc = could not find container \"91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303\": container with ID starting with 91972c7dcc81eb3393d27eb6201cd982d011fd7ba821b2c960afca7fddb88303 not found: ID does not exist" Apr 24 21:47:10.258204 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.258184 2571 scope.go:117] "RemoveContainer" containerID="de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0" Apr 24 21:47:10.258407 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:47:10.258386 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0\": container with ID starting with de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0 not found: ID does not exist" containerID="de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0" Apr 24 21:47:10.258512 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.258408 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0"} err="failed to get container status \"de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0\": rpc error: code = NotFound desc = could not find container \"de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0\": container with ID starting with 
de84189534d09e30ae0fb46930583e0205001a8b615b68ca07ccc1411449c1b0 not found: ID does not exist" Apr 24 21:47:10.258512 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.258421 2571 scope.go:117] "RemoveContainer" containerID="5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2" Apr 24 21:47:10.258670 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:47:10.258656 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2\": container with ID starting with 5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2 not found: ID does not exist" containerID="5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2" Apr 24 21:47:10.258729 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:10.258673 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2"} err="failed to get container status \"5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2\": rpc error: code = NotFound desc = could not find container \"5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2\": container with ID starting with 5915e5625824302d0cdc285ab558da4e9b28292e87c3c2f5594217b2186165c2 not found: ID does not exist" Apr 24 21:47:11.233941 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:11.233904 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" event={"ID":"86edb432-6d7e-415f-9d15-3711f64b138b","Type":"ContainerStarted","Data":"921472cdc5d59181847d082a37353fe450d9ad641464fa08979312fb64f52600"} Apr 24 21:47:11.647670 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:11.647627 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" 
path="/var/lib/kubelet/pods/87645a8d-7741-40a4-99b8-38ab9ecb1671/volumes" Apr 24 21:47:14.248578 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:14.248541 2571 generic.go:358] "Generic (PLEG): container finished" podID="86edb432-6d7e-415f-9d15-3711f64b138b" containerID="921472cdc5d59181847d082a37353fe450d9ad641464fa08979312fb64f52600" exitCode=0 Apr 24 21:47:14.249050 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:14.248620 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" event={"ID":"86edb432-6d7e-415f-9d15-3711f64b138b","Type":"ContainerDied","Data":"921472cdc5d59181847d082a37353fe450d9ad641464fa08979312fb64f52600"} Apr 24 21:47:15.256466 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:15.256430 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" event={"ID":"86edb432-6d7e-415f-9d15-3711f64b138b","Type":"ContainerStarted","Data":"b73ecc7e3ca83c8a27440daded60b0a39dc176c79325dd825f576388490ddabe"} Apr 24 21:47:15.256859 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:15.256476 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" event={"ID":"86edb432-6d7e-415f-9d15-3711f64b138b","Type":"ContainerStarted","Data":"3f1303121084fe9f976725ba80ce4ed07c95a139eaaee63174f81b08bdcd9cdf"} Apr 24 21:47:15.256859 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:15.256724 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:15.256859 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:15.256761 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:15.276990 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:15.276928 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" podStartSLOduration=6.276908767 podStartE2EDuration="6.276908767s" podCreationTimestamp="2026-04-24 21:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:15.276674632 +0000 UTC m=+1236.124718748" watchObservedRunningTime="2026-04-24 21:47:15.276908767 +0000 UTC m=+1236.124952883" Apr 24 21:47:21.265876 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:21.265844 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:51.270165 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:51.270136 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:47:59.202393 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.202352 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4"] Apr 24 21:47:59.202837 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.202648 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="kserve-container" containerID="cri-o://3f1303121084fe9f976725ba80ce4ed07c95a139eaaee63174f81b08bdcd9cdf" gracePeriod=30 Apr 24 21:47:59.202837 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.202694 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="kube-rbac-proxy" 
containerID="cri-o://b73ecc7e3ca83c8a27440daded60b0a39dc176c79325dd825f576388490ddabe" gracePeriod=30 Apr 24 21:47:59.380872 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.380832 2571 generic.go:358] "Generic (PLEG): container finished" podID="86edb432-6d7e-415f-9d15-3711f64b138b" containerID="b73ecc7e3ca83c8a27440daded60b0a39dc176c79325dd825f576388490ddabe" exitCode=2 Apr 24 21:47:59.381071 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.380915 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" event={"ID":"86edb432-6d7e-415f-9d15-3711f64b138b","Type":"ContainerDied","Data":"b73ecc7e3ca83c8a27440daded60b0a39dc176c79325dd825f576388490ddabe"} Apr 24 21:47:59.383072 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383051 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk"] Apr 24 21:47:59.383340 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383328 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="storage-initializer" Apr 24 21:47:59.383395 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383342 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="storage-initializer" Apr 24 21:47:59.383395 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383353 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kube-rbac-proxy" Apr 24 21:47:59.383395 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383360 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kube-rbac-proxy" Apr 24 21:47:59.383395 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383375 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kserve-container" Apr 24 21:47:59.383395 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383381 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kserve-container" Apr 24 21:47:59.383541 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383436 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kserve-container" Apr 24 21:47:59.383541 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.383446 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="87645a8d-7741-40a4-99b8-38ab9ecb1671" containerName="kube-rbac-proxy" Apr 24 21:47:59.386418 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.386403 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.388784 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.388764 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 24 21:47:59.388899 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.388766 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 24 21:47:59.397875 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.397853 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk"] Apr 24 21:47:59.505245 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.505214 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") 
" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.505423 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.505268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.505423 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.505388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.505498 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.505440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9gr\" (UniqueName: \"kubernetes.io/projected/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kube-api-access-4x9gr\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.606034 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.605993 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.606223 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.606071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.606223 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.606115 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9gr\" (UniqueName: \"kubernetes.io/projected/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kube-api-access-4x9gr\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.606223 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.606140 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.606438 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:47:59.606321 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-serving-cert: secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 24 21:47:59.606438 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:47:59.606394 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls podName:32d8c97b-83b4-4ab4-b849-d5dc0eb569a3 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:48:00.106371275 +0000 UTC m=+1280.954415373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls") pod "isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" (UID: "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3") : secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 24 21:47:59.606555 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.606536 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.606756 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.606739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:47:59.617006 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:47:59.616984 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9gr\" (UniqueName: \"kubernetes.io/projected/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kube-api-access-4x9gr\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:48:00.109874 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.109839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:48:00.112653 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.112620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:48:00.297126 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.297089 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:48:00.385948 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.385915 2571 generic.go:358] "Generic (PLEG): container finished" podID="86edb432-6d7e-415f-9d15-3711f64b138b" containerID="3f1303121084fe9f976725ba80ce4ed07c95a139eaaee63174f81b08bdcd9cdf" exitCode=0 Apr 24 21:48:00.386091 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.385983 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" event={"ID":"86edb432-6d7e-415f-9d15-3711f64b138b","Type":"ContainerDied","Data":"3f1303121084fe9f976725ba80ce4ed07c95a139eaaee63174f81b08bdcd9cdf"} Apr 24 21:48:00.439890 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.439864 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk"] Apr 24 21:48:00.442468 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:48:00.442438 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32d8c97b_83b4_4ab4_b849_d5dc0eb569a3.slice/crio-86ccbb382ce5e5743e0d2754e7468985fc6a5ce5710633c3846ffd62f796d3b9 WatchSource:0}: Error finding container 86ccbb382ce5e5743e0d2754e7468985fc6a5ce5710633c3846ffd62f796d3b9: Status 404 returned error can't find the container with id 86ccbb382ce5e5743e0d2754e7468985fc6a5ce5710633c3846ffd62f796d3b9 Apr 24 21:48:00.445809 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.445788 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:48:00.613384 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.613350 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86edb432-6d7e-415f-9d15-3711f64b138b-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"86edb432-6d7e-415f-9d15-3711f64b138b\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " Apr 24 21:48:00.613635 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.613407 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh5xc\" (UniqueName: \"kubernetes.io/projected/86edb432-6d7e-415f-9d15-3711f64b138b-kube-api-access-zh5xc\") pod \"86edb432-6d7e-415f-9d15-3711f64b138b\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " Apr 24 21:48:00.613635 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.613441 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86edb432-6d7e-415f-9d15-3711f64b138b-kserve-provision-location\") pod \"86edb432-6d7e-415f-9d15-3711f64b138b\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " Apr 24 21:48:00.613635 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.613461 2571 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls\") pod \"86edb432-6d7e-415f-9d15-3711f64b138b\" (UID: \"86edb432-6d7e-415f-9d15-3711f64b138b\") " Apr 24 21:48:00.613772 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.613744 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86edb432-6d7e-415f-9d15-3711f64b138b-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "86edb432-6d7e-415f-9d15-3711f64b138b" (UID: "86edb432-6d7e-415f-9d15-3711f64b138b"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:48:00.613823 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.613796 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86edb432-6d7e-415f-9d15-3711f64b138b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "86edb432-6d7e-415f-9d15-3711f64b138b" (UID: "86edb432-6d7e-415f-9d15-3711f64b138b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:00.615737 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.615710 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "86edb432-6d7e-415f-9d15-3711f64b138b" (UID: "86edb432-6d7e-415f-9d15-3711f64b138b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:00.615809 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.615739 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86edb432-6d7e-415f-9d15-3711f64b138b-kube-api-access-zh5xc" (OuterVolumeSpecName: "kube-api-access-zh5xc") pod "86edb432-6d7e-415f-9d15-3711f64b138b" (UID: "86edb432-6d7e-415f-9d15-3711f64b138b"). InnerVolumeSpecName "kube-api-access-zh5xc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:48:00.714497 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.714413 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zh5xc\" (UniqueName: \"kubernetes.io/projected/86edb432-6d7e-415f-9d15-3711f64b138b-kube-api-access-zh5xc\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:48:00.714497 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.714444 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86edb432-6d7e-415f-9d15-3711f64b138b-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:48:00.714497 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.714460 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86edb432-6d7e-415f-9d15-3711f64b138b-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:48:00.714497 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:00.714473 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86edb432-6d7e-415f-9d15-3711f64b138b-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:48:01.389854 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.389810 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerStarted","Data":"84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7"} Apr 24 21:48:01.389854 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.389857 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerStarted","Data":"86ccbb382ce5e5743e0d2754e7468985fc6a5ce5710633c3846ffd62f796d3b9"} Apr 24 21:48:01.391515 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.391486 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" event={"ID":"86edb432-6d7e-415f-9d15-3711f64b138b","Type":"ContainerDied","Data":"e54c4da138693737fafb144a2e8a46eeea6375ee1468cb41d0b6aa0bf64e452a"} Apr 24 21:48:01.391515 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.391512 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4" Apr 24 21:48:01.391649 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.391534 2571 scope.go:117] "RemoveContainer" containerID="b73ecc7e3ca83c8a27440daded60b0a39dc176c79325dd825f576388490ddabe" Apr 24 21:48:01.402902 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.402881 2571 scope.go:117] "RemoveContainer" containerID="3f1303121084fe9f976725ba80ce4ed07c95a139eaaee63174f81b08bdcd9cdf" Apr 24 21:48:01.410042 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.410024 2571 scope.go:117] "RemoveContainer" containerID="921472cdc5d59181847d082a37353fe450d9ad641464fa08979312fb64f52600" Apr 24 21:48:01.424494 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.424475 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4"] Apr 24 21:48:01.428360 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.428341 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-8btt4"] Apr 24 21:48:01.648187 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:01.648157 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" path="/var/lib/kubelet/pods/86edb432-6d7e-415f-9d15-3711f64b138b/volumes" Apr 24 21:48:04.402240 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:04.402212 2571 generic.go:358] "Generic (PLEG): container finished" podID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerID="84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7" exitCode=0 Apr 24 21:48:04.402528 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:04.402275 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" 
event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerDied","Data":"84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7"} Apr 24 21:48:05.413888 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:05.413848 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerStarted","Data":"9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff"} Apr 24 21:48:07.422810 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:07.422775 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerStarted","Data":"709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451"} Apr 24 21:48:07.422810 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:07.422810 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerStarted","Data":"56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776"} Apr 24 21:48:07.423246 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:07.422976 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:48:07.423246 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:07.423068 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:48:07.453124 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:07.453068 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podStartSLOduration=6.135367581 podStartE2EDuration="8.453054648s" podCreationTimestamp="2026-04-24 
21:47:59 +0000 UTC" firstStartedPulling="2026-04-24 21:48:04.487755269 +0000 UTC m=+1285.335799366" lastFinishedPulling="2026-04-24 21:48:06.805442338 +0000 UTC m=+1287.653486433" observedRunningTime="2026-04-24 21:48:07.451441759 +0000 UTC m=+1288.299485875" watchObservedRunningTime="2026-04-24 21:48:07.453054648 +0000 UTC m=+1288.301098765" Apr 24 21:48:08.425644 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:08.425616 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:48:14.433436 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:14.433403 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:48:34.434724 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:34.434683 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:48:44.435948 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:48:44.435921 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:49:14.436969 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:14.436899 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:49:19.375375 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.375330 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk"] Apr 24 21:49:19.376071 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.375762 2571 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-container" containerID="cri-o://9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff" gracePeriod=30 Apr 24 21:49:19.376071 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.375818 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" containerID="cri-o://709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451" gracePeriod=30 Apr 24 21:49:19.376071 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.375871 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-agent" containerID="cri-o://56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776" gracePeriod=30 Apr 24 21:49:19.429064 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.429016 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:49:19.452109 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.452075 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5"] Apr 24 21:49:19.453115 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.453086 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="kube-rbac-proxy" Apr 24 21:49:19.453312 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 21:49:19.453277 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="kube-rbac-proxy" Apr 24 21:49:19.453389 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.453325 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="storage-initializer" Apr 24 21:49:19.453389 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.453336 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="storage-initializer" Apr 24 21:49:19.453389 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.453360 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="kserve-container" Apr 24 21:49:19.453389 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.453370 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="kserve-container" Apr 24 21:49:19.453577 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.453560 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="kube-rbac-proxy" Apr 24 21:49:19.453632 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.453586 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="86edb432-6d7e-415f-9d15-3711f64b138b" containerName="kserve-container" Apr 24 21:49:19.456866 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.456835 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.459691 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.459672 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 24 21:49:19.459691 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.459681 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 24 21:49:19.467052 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.467027 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5"] Apr 24 21:49:19.593902 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.593863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2slm5\" (UniqueName: \"kubernetes.io/projected/7dadf643-d1ec-47e8-9b32-9e946d424d06-kube-api-access-2slm5\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.594062 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.593915 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dadf643-d1ec-47e8-9b32-9e946d424d06-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.594062 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.593971 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dadf643-d1ec-47e8-9b32-9e946d424d06-proxy-tls\") pod 
\"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.594062 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.594000 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dadf643-d1ec-47e8-9b32-9e946d424d06-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.635178 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.635088 2571 generic.go:358] "Generic (PLEG): container finished" podID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerID="709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451" exitCode=2 Apr 24 21:49:19.635178 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.635160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerDied","Data":"709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451"} Apr 24 21:49:19.695111 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.695076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2slm5\" (UniqueName: \"kubernetes.io/projected/7dadf643-d1ec-47e8-9b32-9e946d424d06-kube-api-access-2slm5\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.695270 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.695124 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/7dadf643-d1ec-47e8-9b32-9e946d424d06-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.695270 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.695242 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dadf643-d1ec-47e8-9b32-9e946d424d06-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.695430 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.695316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dadf643-d1ec-47e8-9b32-9e946d424d06-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.695762 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.695741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dadf643-d1ec-47e8-9b32-9e946d424d06-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.695846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.695741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dadf643-d1ec-47e8-9b32-9e946d424d06-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: 
\"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.697603 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.697587 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dadf643-d1ec-47e8-9b32-9e946d424d06-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.704159 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.704137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2slm5\" (UniqueName: \"kubernetes.io/projected/7dadf643-d1ec-47e8-9b32-9e946d424d06-kube-api-access-2slm5\") pod \"isvc-paddle-predictor-6b8b7cfb4b-wd9c5\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.768055 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.768010 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:19.900722 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:19.900700 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5"] Apr 24 21:49:19.903039 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:49:19.903014 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dadf643_d1ec_47e8_9b32_9e946d424d06.slice/crio-8d0a61570bc22bf97b1ba4a78e4cdb3cabf67b7319bbcf4788fac7ba21552bf9 WatchSource:0}: Error finding container 8d0a61570bc22bf97b1ba4a78e4cdb3cabf67b7319bbcf4788fac7ba21552bf9: Status 404 returned error can't find the container with id 8d0a61570bc22bf97b1ba4a78e4cdb3cabf67b7319bbcf4788fac7ba21552bf9 Apr 24 21:49:20.640464 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:20.640425 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" event={"ID":"7dadf643-d1ec-47e8-9b32-9e946d424d06","Type":"ContainerStarted","Data":"83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c"} Apr 24 21:49:20.640464 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:20.640466 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" event={"ID":"7dadf643-d1ec-47e8-9b32-9e946d424d06","Type":"ContainerStarted","Data":"8d0a61570bc22bf97b1ba4a78e4cdb3cabf67b7319bbcf4788fac7ba21552bf9"} Apr 24 21:49:21.645161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:21.645083 2571 generic.go:358] "Generic (PLEG): container finished" podID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerID="9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff" exitCode=0 Apr 24 21:49:21.647090 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:21.647066 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerDied","Data":"9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff"} Apr 24 21:49:24.429244 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:24.429210 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:49:24.434611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:24.434590 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:49:24.654428 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:24.654394 2571 generic.go:358] "Generic (PLEG): container finished" podID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerID="83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c" exitCode=0 Apr 24 21:49:24.654584 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:24.654454 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" event={"ID":"7dadf643-d1ec-47e8-9b32-9e946d424d06","Type":"ContainerDied","Data":"83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c"} Apr 24 21:49:29.429406 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:29.429361 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:49:29.429866 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:29.429521 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:49:34.428992 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:34.428948 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:49:34.434379 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:34.434348 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:49:36.690977 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:36.690947 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" event={"ID":"7dadf643-d1ec-47e8-9b32-9e946d424d06","Type":"ContainerStarted","Data":"9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301"} Apr 24 21:49:36.690977 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:36.690982 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" event={"ID":"7dadf643-d1ec-47e8-9b32-9e946d424d06","Type":"ContainerStarted","Data":"e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb"} Apr 24 21:49:36.691434 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:36.691320 2571 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:36.691474 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:36.691435 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:36.692696 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:36.692667 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:49:36.711593 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:36.711555 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podStartSLOduration=6.727251057 podStartE2EDuration="17.711544767s" podCreationTimestamp="2026-04-24 21:49:19 +0000 UTC" firstStartedPulling="2026-04-24 21:49:24.655606724 +0000 UTC m=+1365.503650820" lastFinishedPulling="2026-04-24 21:49:35.639900434 +0000 UTC m=+1376.487944530" observedRunningTime="2026-04-24 21:49:36.711262938 +0000 UTC m=+1377.559307052" watchObservedRunningTime="2026-04-24 21:49:36.711544767 +0000 UTC m=+1377.559588881" Apr 24 21:49:37.694421 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:37.694384 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:49:39.428836 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:39.428720 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:49:42.698698 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:42.698672 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:49:42.699102 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:42.699076 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:49:44.429054 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:44.429002 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:49:44.434586 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:44.434552 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:49:44.434707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:44.434687 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:49:49.428698 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.428666 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:49:49.559594 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.559569 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:49:49.643051 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.642967 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls\") pod \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " Apr 24 21:49:49.643051 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.643036 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x9gr\" (UniqueName: \"kubernetes.io/projected/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kube-api-access-4x9gr\") pod \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " Apr 24 21:49:49.643273 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.643057 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " Apr 24 21:49:49.643273 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.643089 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kserve-provision-location\") pod 
\"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\" (UID: \"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3\") " Apr 24 21:49:49.643546 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.643502 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" (UID: "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:49.643546 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.643512 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" (UID: "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:49:49.645392 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.645369 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kube-api-access-4x9gr" (OuterVolumeSpecName: "kube-api-access-4x9gr") pod "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" (UID: "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3"). InnerVolumeSpecName "kube-api-access-4x9gr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:49.645578 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.645560 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" (UID: "32d8c97b-83b4-4ab4-b849-d5dc0eb569a3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:49.728744 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.728709 2571 generic.go:358] "Generic (PLEG): container finished" podID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerID="56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776" exitCode=137 Apr 24 21:49:49.728897 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.728791 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerDied","Data":"56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776"} Apr 24 21:49:49.728897 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.728822 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" event={"ID":"32d8c97b-83b4-4ab4-b849-d5dc0eb569a3","Type":"ContainerDied","Data":"86ccbb382ce5e5743e0d2754e7468985fc6a5ce5710633c3846ffd62f796d3b9"} Apr 24 21:49:49.728897 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.728844 2571 scope.go:117] "RemoveContainer" containerID="709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451" Apr 24 21:49:49.728897 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.728854 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk" Apr 24 21:49:49.736984 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.736962 2571 scope.go:117] "RemoveContainer" containerID="56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776" Apr 24 21:49:49.743959 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.743936 2571 scope.go:117] "RemoveContainer" containerID="9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff" Apr 24 21:49:49.744176 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.744155 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4x9gr\" (UniqueName: \"kubernetes.io/projected/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kube-api-access-4x9gr\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:49:49.744271 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.744182 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:49:49.744271 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.744199 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:49:49.744271 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.744213 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:49:49.747902 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.747878 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk"] Apr 24 21:49:49.750599 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.750579 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-74b9b7ddc5-lhrpk"] Apr 24 21:49:49.751825 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.751807 2571 scope.go:117] "RemoveContainer" containerID="84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7" Apr 24 21:49:49.757981 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.757965 2571 scope.go:117] "RemoveContainer" containerID="709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451" Apr 24 21:49:49.758205 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:49:49.758188 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451\": container with ID starting with 709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451 not found: ID does not exist" containerID="709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451" Apr 24 21:49:49.758249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.758214 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451"} err="failed to get container status \"709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451\": rpc error: code = NotFound desc = could not find container \"709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451\": container with ID starting with 709cc11d2b13c3633791762702be8242133db1301be8f085a06aa83f84c87451 not found: ID does not exist" Apr 24 21:49:49.758249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.758234 2571 scope.go:117] "RemoveContainer" containerID="56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776" Apr 24 21:49:49.758479 ip-10-0-139-5 
kubenswrapper[2571]: E0424 21:49:49.758463 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776\": container with ID starting with 56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776 not found: ID does not exist" containerID="56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776" Apr 24 21:49:49.758526 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.758485 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776"} err="failed to get container status \"56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776\": rpc error: code = NotFound desc = could not find container \"56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776\": container with ID starting with 56998f76910336a5ec399512a8731fe578a054c533bb46c0dd04098bcfdaa776 not found: ID does not exist" Apr 24 21:49:49.758526 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.758501 2571 scope.go:117] "RemoveContainer" containerID="9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff" Apr 24 21:49:49.758733 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:49:49.758716 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff\": container with ID starting with 9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff not found: ID does not exist" containerID="9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff" Apr 24 21:49:49.758792 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.758742 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff"} err="failed to get 
container status \"9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff\": rpc error: code = NotFound desc = could not find container \"9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff\": container with ID starting with 9e5fe244c845f6c819ca4da3ccd57e59c06d5d26c4b80ab3131abdd5c6473cff not found: ID does not exist" Apr 24 21:49:49.758792 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.758764 2571 scope.go:117] "RemoveContainer" containerID="84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7" Apr 24 21:49:49.758961 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:49:49.758945 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7\": container with ID starting with 84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7 not found: ID does not exist" containerID="84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7" Apr 24 21:49:49.758999 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:49.758967 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7"} err="failed to get container status \"84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7\": rpc error: code = NotFound desc = could not find container \"84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7\": container with ID starting with 84584b8c2c3444f352f6924f4ca983e505babcd1749607b2118d0262c99cf1e7 not found: ID does not exist" Apr 24 21:49:51.648611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:51.648573 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" path="/var/lib/kubelet/pods/32d8c97b-83b4-4ab4-b849-d5dc0eb569a3/volumes" Apr 24 21:49:52.699684 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:49:52.699648 2571 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:50:02.699963 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:02.699924 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:50:12.699098 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:12.699056 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:50:22.699781 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:22.699754 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:50:30.780832 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.780748 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5"] Apr 24 21:50:30.781202 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.781055 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" containerID="cri-o://e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb" gracePeriod=30 Apr 24 21:50:30.781202 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.781118 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kube-rbac-proxy" containerID="cri-o://9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301" gracePeriod=30 Apr 24 21:50:30.854108 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854083 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj"] Apr 24 21:50:30.854432 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854420 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="storage-initializer" Apr 24 21:50:30.854477 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854435 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="storage-initializer" Apr 24 21:50:30.854477 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854444 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" Apr 24 21:50:30.854477 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854450 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" Apr 24 21:50:30.854477 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854461 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-container" Apr 24 21:50:30.854477 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854469 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-container" Apr 24 21:50:30.854477 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854479 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" 
containerName="kserve-agent" Apr 24 21:50:30.854648 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854484 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-agent" Apr 24 21:50:30.854648 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854554 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-container" Apr 24 21:50:30.854648 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854564 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kserve-agent" Apr 24 21:50:30.854648 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.854570 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="32d8c97b-83b4-4ab4-b849-d5dc0eb569a3" containerName="kube-rbac-proxy" Apr 24 21:50:30.866982 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.866962 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:30.870214 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.870191 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:50:30.870483 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.870465 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 24 21:50:30.871573 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.871553 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj"] Apr 24 21:50:30.965302 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.965276 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e691a15a-3026-498d-ae2e-6fd06b6ab504-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:30.965404 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.965344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e691a15a-3026-498d-ae2e-6fd06b6ab504-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:30.965404 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.965381 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/e691a15a-3026-498d-ae2e-6fd06b6ab504-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:30.965404 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:30.965401 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn899\" (UniqueName: \"kubernetes.io/projected/e691a15a-3026-498d-ae2e-6fd06b6ab504-kube-api-access-mn899\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.066275 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.066203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e691a15a-3026-498d-ae2e-6fd06b6ab504-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.066275 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.066239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e691a15a-3026-498d-ae2e-6fd06b6ab504-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.066482 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.066340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e691a15a-3026-498d-ae2e-6fd06b6ab504-proxy-tls\") pod 
\"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.066482 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.066379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn899\" (UniqueName: \"kubernetes.io/projected/e691a15a-3026-498d-ae2e-6fd06b6ab504-kube-api-access-mn899\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.066646 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.066614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e691a15a-3026-498d-ae2e-6fd06b6ab504-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.066922 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.066899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e691a15a-3026-498d-ae2e-6fd06b6ab504-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.068953 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.068930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e691a15a-3026-498d-ae2e-6fd06b6ab504-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.074609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.074586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn899\" (UniqueName: \"kubernetes.io/projected/e691a15a-3026-498d-ae2e-6fd06b6ab504-kube-api-access-mn899\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.177279 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.177253 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:31.293160 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.292964 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj"] Apr 24 21:50:31.295617 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:50:31.295591 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode691a15a_3026_498d_ae2e_6fd06b6ab504.slice/crio-a24dd892e07a3d1f5ab964d1c1e6d41df5d2e69b2c153cf5edef8b5fbd0502f4 WatchSource:0}: Error finding container a24dd892e07a3d1f5ab964d1c1e6d41df5d2e69b2c153cf5edef8b5fbd0502f4: Status 404 returned error can't find the container with id a24dd892e07a3d1f5ab964d1c1e6d41df5d2e69b2c153cf5edef8b5fbd0502f4 Apr 24 21:50:31.847344 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.847288 2571 generic.go:358] "Generic (PLEG): container finished" podID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerID="9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301" exitCode=2 Apr 24 21:50:31.847344 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.847331 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" 
event={"ID":"7dadf643-d1ec-47e8-9b32-9e946d424d06","Type":"ContainerDied","Data":"9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301"} Apr 24 21:50:31.848530 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.848508 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" event={"ID":"e691a15a-3026-498d-ae2e-6fd06b6ab504","Type":"ContainerStarted","Data":"5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d"} Apr 24 21:50:31.848642 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:31.848535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" event={"ID":"e691a15a-3026-498d-ae2e-6fd06b6ab504","Type":"ContainerStarted","Data":"a24dd892e07a3d1f5ab964d1c1e6d41df5d2e69b2c153cf5edef8b5fbd0502f4"} Apr 24 21:50:32.695268 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:32.695229 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 24 21:50:32.699511 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:32.699492 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:50:33.317362 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.317339 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:50:33.487007 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.486910 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dadf643-d1ec-47e8-9b32-9e946d424d06-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"7dadf643-d1ec-47e8-9b32-9e946d424d06\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " Apr 24 21:50:33.487007 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.486959 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dadf643-d1ec-47e8-9b32-9e946d424d06-proxy-tls\") pod \"7dadf643-d1ec-47e8-9b32-9e946d424d06\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " Apr 24 21:50:33.487007 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.486977 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dadf643-d1ec-47e8-9b32-9e946d424d06-kserve-provision-location\") pod \"7dadf643-d1ec-47e8-9b32-9e946d424d06\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " Apr 24 21:50:33.487007 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.486999 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2slm5\" (UniqueName: \"kubernetes.io/projected/7dadf643-d1ec-47e8-9b32-9e946d424d06-kube-api-access-2slm5\") pod \"7dadf643-d1ec-47e8-9b32-9e946d424d06\" (UID: \"7dadf643-d1ec-47e8-9b32-9e946d424d06\") " Apr 24 21:50:33.487376 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.487350 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dadf643-d1ec-47e8-9b32-9e946d424d06-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod 
"7dadf643-d1ec-47e8-9b32-9e946d424d06" (UID: "7dadf643-d1ec-47e8-9b32-9e946d424d06"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:33.489217 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.489191 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dadf643-d1ec-47e8-9b32-9e946d424d06-kube-api-access-2slm5" (OuterVolumeSpecName: "kube-api-access-2slm5") pod "7dadf643-d1ec-47e8-9b32-9e946d424d06" (UID: "7dadf643-d1ec-47e8-9b32-9e946d424d06"). InnerVolumeSpecName "kube-api-access-2slm5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:33.489217 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.489200 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dadf643-d1ec-47e8-9b32-9e946d424d06-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7dadf643-d1ec-47e8-9b32-9e946d424d06" (UID: "7dadf643-d1ec-47e8-9b32-9e946d424d06"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:33.496580 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.496555 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dadf643-d1ec-47e8-9b32-9e946d424d06-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7dadf643-d1ec-47e8-9b32-9e946d424d06" (UID: "7dadf643-d1ec-47e8-9b32-9e946d424d06"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:33.587946 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.587918 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dadf643-d1ec-47e8-9b32-9e946d424d06-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:50:33.587946 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.587944 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dadf643-d1ec-47e8-9b32-9e946d424d06-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:50:33.588105 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.587955 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dadf643-d1ec-47e8-9b32-9e946d424d06-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:50:33.588105 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.587964 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2slm5\" (UniqueName: \"kubernetes.io/projected/7dadf643-d1ec-47e8-9b32-9e946d424d06-kube-api-access-2slm5\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:50:33.855327 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.855276 2571 generic.go:358] "Generic (PLEG): container finished" podID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerID="e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb" exitCode=0 Apr 24 21:50:33.855504 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.855342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" event={"ID":"7dadf643-d1ec-47e8-9b32-9e946d424d06","Type":"ContainerDied","Data":"e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb"} Apr 24 
21:50:33.855504 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.855372 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" Apr 24 21:50:33.855504 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.855388 2571 scope.go:117] "RemoveContainer" containerID="9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301" Apr 24 21:50:33.855504 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.855374 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5" event={"ID":"7dadf643-d1ec-47e8-9b32-9e946d424d06","Type":"ContainerDied","Data":"8d0a61570bc22bf97b1ba4a78e4cdb3cabf67b7319bbcf4788fac7ba21552bf9"} Apr 24 21:50:33.862985 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.862962 2571 scope.go:117] "RemoveContainer" containerID="e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb" Apr 24 21:50:33.869449 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.869434 2571 scope.go:117] "RemoveContainer" containerID="83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c" Apr 24 21:50:33.874627 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.874606 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5"] Apr 24 21:50:33.876224 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.876183 2571 scope.go:117] "RemoveContainer" containerID="9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301" Apr 24 21:50:33.876514 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:50:33.876475 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301\": container with ID starting with 9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301 not found: ID does not exist" 
containerID="9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301" Apr 24 21:50:33.876647 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.876530 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301"} err="failed to get container status \"9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301\": rpc error: code = NotFound desc = could not find container \"9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301\": container with ID starting with 9fd26adc48dd814e57520c5a85b791fcce18bfd7667089a2d0769b083bd93301 not found: ID does not exist" Apr 24 21:50:33.876647 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.876552 2571 scope.go:117] "RemoveContainer" containerID="e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb" Apr 24 21:50:33.877007 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:50:33.876904 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb\": container with ID starting with e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb not found: ID does not exist" containerID="e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb" Apr 24 21:50:33.877007 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.876952 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb"} err="failed to get container status \"e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb\": rpc error: code = NotFound desc = could not find container \"e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb\": container with ID starting with e15e6aaaf798c817acbe15f7a8b7836c5680cbbef972ecf41da12905fab787eb not found: ID does not exist" Apr 24 21:50:33.877007 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.876973 2571 scope.go:117] "RemoveContainer" containerID="83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c" Apr 24 21:50:33.877248 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:50:33.877231 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c\": container with ID starting with 83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c not found: ID does not exist" containerID="83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c" Apr 24 21:50:33.877290 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.877255 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c"} err="failed to get container status \"83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c\": rpc error: code = NotFound desc = could not find container \"83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c\": container with ID starting with 83f7435013da234e37b48eb3f1166956caad21239d0e6e9ad38249a48c31b34c not found: ID does not exist" Apr 24 21:50:33.878668 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:33.878652 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-wd9c5"] Apr 24 21:50:35.648098 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:35.648070 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" path="/var/lib/kubelet/pods/7dadf643-d1ec-47e8-9b32-9e946d424d06/volumes" Apr 24 21:50:35.863209 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:35.863130 2571 generic.go:358] "Generic (PLEG): container finished" podID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerID="5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d" 
exitCode=0 Apr 24 21:50:35.863209 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:35.863197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" event={"ID":"e691a15a-3026-498d-ae2e-6fd06b6ab504","Type":"ContainerDied","Data":"5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d"} Apr 24 21:50:36.868253 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:36.868220 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" event={"ID":"e691a15a-3026-498d-ae2e-6fd06b6ab504","Type":"ContainerStarted","Data":"1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19"} Apr 24 21:50:36.868614 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:36.868262 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" event={"ID":"e691a15a-3026-498d-ae2e-6fd06b6ab504","Type":"ContainerStarted","Data":"a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549"} Apr 24 21:50:36.868614 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:36.868489 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:36.887612 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:36.887566 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podStartSLOduration=6.887550952 podStartE2EDuration="6.887550952s" podCreationTimestamp="2026-04-24 21:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:36.886864773 +0000 UTC m=+1437.734908903" watchObservedRunningTime="2026-04-24 21:50:36.887550952 +0000 UTC m=+1437.735595068" Apr 24 21:50:37.871452 ip-10-0-139-5 kubenswrapper[2571]: I0424 
21:50:37.871417 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:37.872461 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:37.872437 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:50:38.873982 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:38.873945 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:50:43.878000 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:43.877972 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:50:43.878570 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:43.878544 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:50:53.878956 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:50:53.878912 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:51:03.879071 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:03.879031 2571 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:51:13.878600 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:13.878559 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:51:23.879335 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:23.879284 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:51:32.264142 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.264107 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj"] Apr 24 21:51:32.264552 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.264511 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container" containerID="cri-o://a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549" gracePeriod=30 Apr 24 21:51:32.264619 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.264570 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kube-rbac-proxy" containerID="cri-o://1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19" gracePeriod=30 Apr 24 21:51:32.382705 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.382679 2571 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"] Apr 24 21:51:32.382955 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.382944 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="storage-initializer" Apr 24 21:51:32.382995 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.382957 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="storage-initializer" Apr 24 21:51:32.382995 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.382970 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kube-rbac-proxy" Apr 24 21:51:32.382995 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.382975 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kube-rbac-proxy" Apr 24 21:51:32.382995 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.382988 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" Apr 24 21:51:32.382995 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.382993 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" Apr 24 21:51:32.383176 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.383035 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kserve-container" Apr 24 21:51:32.383176 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.383045 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dadf643-d1ec-47e8-9b32-9e946d424d06" containerName="kube-rbac-proxy" Apr 24 21:51:32.385806 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.385789 2571 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.389173 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.389154 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 24 21:51:32.389907 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.389891 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 21:51:32.405456 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.405433 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"] Apr 24 21:51:32.525797 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.525721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/255efb22-913e-457a-ba69-c563dbffdf29-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.525928 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.525804 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/255efb22-913e-457a-ba69-c563dbffdf29-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.525928 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.525839 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kqnqt\" (UniqueName: \"kubernetes.io/projected/255efb22-913e-457a-ba69-c563dbffdf29-kube-api-access-kqnqt\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.525928 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.525863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/255efb22-913e-457a-ba69-c563dbffdf29-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.626542 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.626512 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqnqt\" (UniqueName: \"kubernetes.io/projected/255efb22-913e-457a-ba69-c563dbffdf29-kube-api-access-kqnqt\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.626542 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.626549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/255efb22-913e-457a-ba69-c563dbffdf29-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.626778 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.626622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/255efb22-913e-457a-ba69-c563dbffdf29-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.626778 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.626761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/255efb22-913e-457a-ba69-c563dbffdf29-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.627039 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.627016 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/255efb22-913e-457a-ba69-c563dbffdf29-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.627291 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.627269 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/255efb22-913e-457a-ba69-c563dbffdf29-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.629026 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.629007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/255efb22-913e-457a-ba69-c563dbffdf29-proxy-tls\") pod 
\"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.635221 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.635199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnqt\" (UniqueName: \"kubernetes.io/projected/255efb22-913e-457a-ba69-c563dbffdf29-kube-api-access-kqnqt\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.695277 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.695253 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" Apr 24 21:51:32.810756 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.810733 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"] Apr 24 21:51:32.812734 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:51:32.812712 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255efb22_913e_457a_ba69_c563dbffdf29.slice/crio-2cf0a2ce1331b19a0ae68d6619f95ebfc248dfd13a1019d5ad9543fac4fc71d8 WatchSource:0}: Error finding container 2cf0a2ce1331b19a0ae68d6619f95ebfc248dfd13a1019d5ad9543fac4fc71d8: Status 404 returned error can't find the container with id 2cf0a2ce1331b19a0ae68d6619f95ebfc248dfd13a1019d5ad9543fac4fc71d8 Apr 24 21:51:32.814381 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:32.814361 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:51:33.020006 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:33.019975 2571 generic.go:358] "Generic (PLEG): container finished" podID="e691a15a-3026-498d-ae2e-6fd06b6ab504" 
containerID="1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19" exitCode=2 Apr 24 21:51:33.020178 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:33.020041 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" event={"ID":"e691a15a-3026-498d-ae2e-6fd06b6ab504","Type":"ContainerDied","Data":"1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19"} Apr 24 21:51:33.021347 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:33.021323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" event={"ID":"255efb22-913e-457a-ba69-c563dbffdf29","Type":"ContainerStarted","Data":"fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092"} Apr 24 21:51:33.021454 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:33.021355 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" event={"ID":"255efb22-913e-457a-ba69-c563dbffdf29","Type":"ContainerStarted","Data":"2cf0a2ce1331b19a0ae68d6619f95ebfc248dfd13a1019d5ad9543fac4fc71d8"} Apr 24 21:51:33.874337 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:33.874279 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 24 21:51:33.878640 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:33.878614 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:51:34.902124 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:51:34.902101 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:51:35.028113 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.028080 2571 generic.go:358] "Generic (PLEG): container finished" podID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerID="a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549" exitCode=0 Apr 24 21:51:35.028269 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.028171 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" Apr 24 21:51:35.028269 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.028165 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" event={"ID":"e691a15a-3026-498d-ae2e-6fd06b6ab504","Type":"ContainerDied","Data":"a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549"} Apr 24 21:51:35.028358 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.028274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj" event={"ID":"e691a15a-3026-498d-ae2e-6fd06b6ab504","Type":"ContainerDied","Data":"a24dd892e07a3d1f5ab964d1c1e6d41df5d2e69b2c153cf5edef8b5fbd0502f4"} Apr 24 21:51:35.028358 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.028289 2571 scope.go:117] "RemoveContainer" containerID="1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19" Apr 24 21:51:35.035541 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.035526 2571 scope.go:117] "RemoveContainer" containerID="a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549" Apr 24 21:51:35.042179 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.042165 2571 scope.go:117] "RemoveContainer" 
containerID="5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d" Apr 24 21:51:35.045132 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.045116 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn899\" (UniqueName: \"kubernetes.io/projected/e691a15a-3026-498d-ae2e-6fd06b6ab504-kube-api-access-mn899\") pod \"e691a15a-3026-498d-ae2e-6fd06b6ab504\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " Apr 24 21:51:35.045210 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.045159 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e691a15a-3026-498d-ae2e-6fd06b6ab504-kserve-provision-location\") pod \"e691a15a-3026-498d-ae2e-6fd06b6ab504\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " Apr 24 21:51:35.045210 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.045207 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e691a15a-3026-498d-ae2e-6fd06b6ab504-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"e691a15a-3026-498d-ae2e-6fd06b6ab504\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " Apr 24 21:51:35.045281 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.045232 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e691a15a-3026-498d-ae2e-6fd06b6ab504-proxy-tls\") pod \"e691a15a-3026-498d-ae2e-6fd06b6ab504\" (UID: \"e691a15a-3026-498d-ae2e-6fd06b6ab504\") " Apr 24 21:51:35.045590 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.045560 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e691a15a-3026-498d-ae2e-6fd06b6ab504-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod 
"e691a15a-3026-498d-ae2e-6fd06b6ab504" (UID: "e691a15a-3026-498d-ae2e-6fd06b6ab504"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:51:35.047265 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.047244 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e691a15a-3026-498d-ae2e-6fd06b6ab504-kube-api-access-mn899" (OuterVolumeSpecName: "kube-api-access-mn899") pod "e691a15a-3026-498d-ae2e-6fd06b6ab504" (UID: "e691a15a-3026-498d-ae2e-6fd06b6ab504"). InnerVolumeSpecName "kube-api-access-mn899". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:51:35.047433 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.047415 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e691a15a-3026-498d-ae2e-6fd06b6ab504-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e691a15a-3026-498d-ae2e-6fd06b6ab504" (UID: "e691a15a-3026-498d-ae2e-6fd06b6ab504"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:51:35.049912 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.049897 2571 scope.go:117] "RemoveContainer" containerID="1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19" Apr 24 21:51:35.050165 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:51:35.050149 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19\": container with ID starting with 1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19 not found: ID does not exist" containerID="1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19" Apr 24 21:51:35.050221 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.050171 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19"} err="failed to get container status \"1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19\": rpc error: code = NotFound desc = could not find container \"1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19\": container with ID starting with 1ae5f00038bf2faf067676785ffc0662353f8790cf13a452dd9cbbf3fa47af19 not found: ID does not exist" Apr 24 21:51:35.050221 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.050185 2571 scope.go:117] "RemoveContainer" containerID="a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549" Apr 24 21:51:35.050442 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:51:35.050417 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549\": container with ID starting with a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549 not found: ID does not exist" 
containerID="a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549"
Apr 24 21:51:35.050503 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.050454 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549"} err="failed to get container status \"a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549\": rpc error: code = NotFound desc = could not find container \"a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549\": container with ID starting with a9b7e7e153266248dfabbe2463f816b1b38a5828bb866e582a7c8bb4df457549 not found: ID does not exist"
Apr 24 21:51:35.050503 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.050476 2571 scope.go:117] "RemoveContainer" containerID="5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d"
Apr 24 21:51:35.050713 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:51:35.050698 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d\": container with ID starting with 5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d not found: ID does not exist" containerID="5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d"
Apr 24 21:51:35.050765 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.050718 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d"} err="failed to get container status \"5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d\": rpc error: code = NotFound desc = could not find container \"5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d\": container with ID starting with 5763e91fe2301081e950dd35583e71078670ffbfa8125f59c1e8ea9b7c76472d not found: ID does not exist"
Apr 24 21:51:35.055734 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.055714 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e691a15a-3026-498d-ae2e-6fd06b6ab504-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e691a15a-3026-498d-ae2e-6fd06b6ab504" (UID: "e691a15a-3026-498d-ae2e-6fd06b6ab504"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:51:35.145682 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.145661 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mn899\" (UniqueName: \"kubernetes.io/projected/e691a15a-3026-498d-ae2e-6fd06b6ab504-kube-api-access-mn899\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:51:35.145682 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.145684 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e691a15a-3026-498d-ae2e-6fd06b6ab504-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:51:35.145802 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.145694 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e691a15a-3026-498d-ae2e-6fd06b6ab504-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:51:35.145802 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.145704 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e691a15a-3026-498d-ae2e-6fd06b6ab504-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:51:35.359488 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.359456 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj"]
Apr 24 21:51:35.364663 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.364624 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrkjj"]
Apr 24 21:51:35.651937 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:35.648569 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" path="/var/lib/kubelet/pods/e691a15a-3026-498d-ae2e-6fd06b6ab504/volumes"
Apr 24 21:51:38.037150 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:38.037117 2571 generic.go:358] "Generic (PLEG): container finished" podID="255efb22-913e-457a-ba69-c563dbffdf29" containerID="fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092" exitCode=0
Apr 24 21:51:38.037541 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:38.037192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" event={"ID":"255efb22-913e-457a-ba69-c563dbffdf29","Type":"ContainerDied","Data":"fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092"}
Apr 24 21:51:39.041737 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:39.041705 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" event={"ID":"255efb22-913e-457a-ba69-c563dbffdf29","Type":"ContainerStarted","Data":"ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444"}
Apr 24 21:51:39.042119 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:39.041746 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" event={"ID":"255efb22-913e-457a-ba69-c563dbffdf29","Type":"ContainerStarted","Data":"ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743"}
Apr 24 21:51:39.042119 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:39.041969 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"
Apr 24 21:51:39.062609 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:39.062563 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podStartSLOduration=7.062549012 podStartE2EDuration="7.062549012s" podCreationTimestamp="2026-04-24 21:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:51:39.061473195 +0000 UTC m=+1499.909517309" watchObservedRunningTime="2026-04-24 21:51:39.062549012 +0000 UTC m=+1499.910593128"
Apr 24 21:51:40.044354 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:40.044327 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"
Apr 24 21:51:40.045541 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:40.045514 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:51:41.047049 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:41.047007 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:51:46.051197 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:46.051169 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"
Apr 24 21:51:46.051675 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:46.051638 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:51:56.052396 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:51:56.052358 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:52:06.052388 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:06.052281 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:52:16.052463 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:16.052423 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:52:26.052358 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:26.052330 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"
Apr 24 21:52:33.893206 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.893164 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"]
Apr 24 21:52:33.893717 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.893633 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" containerID="cri-o://ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743" gracePeriod=30
Apr 24 21:52:33.893783 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.893698 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kube-rbac-proxy" containerID="cri-o://ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444" gracePeriod=30
Apr 24 21:52:33.991256 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991225 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"]
Apr 24 21:52:33.991528 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991516 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="storage-initializer"
Apr 24 21:52:33.991578 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991530 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="storage-initializer"
Apr 24 21:52:33.991578 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991546 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container"
Apr 24 21:52:33.991578 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991551 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container"
Apr 24 21:52:33.991578 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991562 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kube-rbac-proxy"
Apr 24 21:52:33.991578 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991567 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kube-rbac-proxy"
Apr 24 21:52:33.991772 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991611 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kube-rbac-proxy"
Apr 24 21:52:33.991772 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.991621 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e691a15a-3026-498d-ae2e-6fd06b6ab504" containerName="kserve-container"
Apr 24 21:52:33.994642 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.994625 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:33.997218 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.997197 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\""
Apr 24 21:52:33.997343 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:33.997205 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\""
Apr 24 21:52:34.004682 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.004661 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"]
Apr 24 21:52:34.093671 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.093643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.093810 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.093685 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.093810 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.093729 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxt2t\" (UniqueName: \"kubernetes.io/projected/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kube-api-access-dxt2t\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.093810 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.093750 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.193070 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.192992 2571 generic.go:358] "Generic (PLEG): container finished" podID="255efb22-913e-457a-ba69-c563dbffdf29" containerID="ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444" exitCode=2
Apr 24 21:52:34.193195 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.193068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" event={"ID":"255efb22-913e-457a-ba69-c563dbffdf29","Type":"ContainerDied","Data":"ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444"}
Apr 24 21:52:34.194286 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.194269 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.194363 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.194335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.194363 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.194360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxt2t\" (UniqueName: \"kubernetes.io/projected/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kube-api-access-dxt2t\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.194426 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.194377 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.194681 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.194660 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.194959 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.194929 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.196817 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.196799 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.203788 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.203769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxt2t\" (UniqueName: \"kubernetes.io/projected/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kube-api-access-dxt2t\") pod \"isvc-pmml-predictor-8bb578669-qq7sw\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.306574 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.306544 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:34.426622 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:34.426492 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"]
Apr 24 21:52:34.429256 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:52:34.429230 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66f2dc9a_eabf_497c_b681_56f9c6acd3ac.slice/crio-864d6bb7ca06a5e110a68a7d8e23596362e1df09414a2caf1508a9139652f150 WatchSource:0}: Error finding container 864d6bb7ca06a5e110a68a7d8e23596362e1df09414a2caf1508a9139652f150: Status 404 returned error can't find the container with id 864d6bb7ca06a5e110a68a7d8e23596362e1df09414a2caf1508a9139652f150
Apr 24 21:52:35.196486 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:35.196444 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" event={"ID":"66f2dc9a-eabf-497c-b681-56f9c6acd3ac","Type":"ContainerStarted","Data":"377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452"}
Apr 24 21:52:35.196486 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:35.196486 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" event={"ID":"66f2dc9a-eabf-497c-b681-56f9c6acd3ac","Type":"ContainerStarted","Data":"864d6bb7ca06a5e110a68a7d8e23596362e1df09414a2caf1508a9139652f150"}
Apr 24 21:52:36.047944 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.047903 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused"
Apr 24 21:52:36.052247 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.052227 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 21:52:36.522090 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.522068 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"
Apr 24 21:52:36.614240 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.614156 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/255efb22-913e-457a-ba69-c563dbffdf29-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"255efb22-913e-457a-ba69-c563dbffdf29\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") "
Apr 24 21:52:36.614240 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.614198 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/255efb22-913e-457a-ba69-c563dbffdf29-kserve-provision-location\") pod \"255efb22-913e-457a-ba69-c563dbffdf29\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") "
Apr 24 21:52:36.614492 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.614255 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqnqt\" (UniqueName: \"kubernetes.io/projected/255efb22-913e-457a-ba69-c563dbffdf29-kube-api-access-kqnqt\") pod \"255efb22-913e-457a-ba69-c563dbffdf29\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") "
Apr 24 21:52:36.614492 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.614347 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/255efb22-913e-457a-ba69-c563dbffdf29-proxy-tls\") pod \"255efb22-913e-457a-ba69-c563dbffdf29\" (UID: \"255efb22-913e-457a-ba69-c563dbffdf29\") "
Apr 24 21:52:36.614626 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.614602 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/255efb22-913e-457a-ba69-c563dbffdf29-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "255efb22-913e-457a-ba69-c563dbffdf29" (UID: "255efb22-913e-457a-ba69-c563dbffdf29"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:52:36.616475 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.616453 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255efb22-913e-457a-ba69-c563dbffdf29-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "255efb22-913e-457a-ba69-c563dbffdf29" (UID: "255efb22-913e-457a-ba69-c563dbffdf29"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:52:36.616566 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.616543 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255efb22-913e-457a-ba69-c563dbffdf29-kube-api-access-kqnqt" (OuterVolumeSpecName: "kube-api-access-kqnqt") pod "255efb22-913e-457a-ba69-c563dbffdf29" (UID: "255efb22-913e-457a-ba69-c563dbffdf29"). InnerVolumeSpecName "kube-api-access-kqnqt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:52:36.624348 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.624325 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/255efb22-913e-457a-ba69-c563dbffdf29-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "255efb22-913e-457a-ba69-c563dbffdf29" (UID: "255efb22-913e-457a-ba69-c563dbffdf29"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:36.715039 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.715005 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kqnqt\" (UniqueName: \"kubernetes.io/projected/255efb22-913e-457a-ba69-c563dbffdf29-kube-api-access-kqnqt\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:52:36.715039 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.715037 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/255efb22-913e-457a-ba69-c563dbffdf29-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:52:36.715215 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.715052 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/255efb22-913e-457a-ba69-c563dbffdf29-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:52:36.715215 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:36.715065 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/255efb22-913e-457a-ba69-c563dbffdf29-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 21:52:37.203150 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.203113 2571 generic.go:358] "Generic (PLEG): container finished" podID="255efb22-913e-457a-ba69-c563dbffdf29" containerID="ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743" exitCode=0
Apr 24 21:52:37.203318 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.203153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" event={"ID":"255efb22-913e-457a-ba69-c563dbffdf29","Type":"ContainerDied","Data":"ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743"}
Apr 24 21:52:37.203318 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.203195 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5" event={"ID":"255efb22-913e-457a-ba69-c563dbffdf29","Type":"ContainerDied","Data":"2cf0a2ce1331b19a0ae68d6619f95ebfc248dfd13a1019d5ad9543fac4fc71d8"}
Apr 24 21:52:37.203318 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.203194 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"
Apr 24 21:52:37.203318 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.203263 2571 scope.go:117] "RemoveContainer" containerID="ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444"
Apr 24 21:52:37.211361 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.211342 2571 scope.go:117] "RemoveContainer" containerID="ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743"
Apr 24 21:52:37.222279 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.222252 2571 scope.go:117] "RemoveContainer" containerID="fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092"
Apr 24 21:52:37.225998 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.225974 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"]
Apr 24 21:52:37.229856 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.229842 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-ls4w5"]
Apr 24 21:52:37.229912 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.229895 2571 scope.go:117] "RemoveContainer" containerID="ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444"
Apr 24 21:52:37.230156 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:52:37.230133 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444\": container with ID starting with ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444 not found: ID does not exist" containerID="ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444"
Apr 24 21:52:37.230245 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.230161 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444"} err="failed to get container status \"ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444\": rpc error: code = NotFound desc = could not find container \"ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444\": container with ID starting with ff7c6016acb894f38b90da0bf7c832835d5511a8adb502263597c553129f9444 not found: ID does not exist"
Apr 24 21:52:37.230245 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.230177 2571 scope.go:117] "RemoveContainer" containerID="ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743"
Apr 24 21:52:37.230426 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:52:37.230407 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743\": container with ID starting with ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743 not found: ID does not exist" containerID="ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743"
Apr 24 21:52:37.230485 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.230434 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743"} err="failed to get container status \"ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743\": rpc error: code = NotFound desc = could not find container \"ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743\": container with ID starting with ea2a469dcef87cf16b8367177b5b8a2f3f5549e150a172b0de415bc2c3e75743 not found: ID does not exist"
Apr 24 21:52:37.230485 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.230456 2571 scope.go:117] "RemoveContainer" containerID="fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092"
Apr 24 21:52:37.230673 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:52:37.230654 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092\": container with ID starting with fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092 not found: ID does not exist" containerID="fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092"
Apr 24 21:52:37.230710 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.230679 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092"} err="failed to get container status \"fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092\": rpc error: code = NotFound desc = could not find container \"fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092\": container with ID starting with fdb079f92abb2a36669a2cbc6fddcf409677dc9ff0b6659e217c70f28355e092 not found: ID does not exist"
Apr 24 21:52:37.647654 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:37.647623 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255efb22-913e-457a-ba69-c563dbffdf29" path="/var/lib/kubelet/pods/255efb22-913e-457a-ba69-c563dbffdf29/volumes"
Apr 24 21:52:38.208155 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:38.208126 2571 generic.go:358] "Generic (PLEG): container finished" podID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerID="377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452" exitCode=0
Apr 24 21:52:38.208279 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:38.208207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" event={"ID":"66f2dc9a-eabf-497c-b681-56f9c6acd3ac","Type":"ContainerDied","Data":"377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452"}
Apr 24 21:52:45.231820 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:45.231739 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" event={"ID":"66f2dc9a-eabf-497c-b681-56f9c6acd3ac","Type":"ContainerStarted","Data":"0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269"}
Apr 24 21:52:45.231820 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:45.231789 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" event={"ID":"66f2dc9a-eabf-497c-b681-56f9c6acd3ac","Type":"ContainerStarted","Data":"e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7"}
Apr 24 21:52:45.232191 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:45.231997 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:45.251461 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:45.251411 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podStartSLOduration=5.640780809 podStartE2EDuration="12.251397585s" podCreationTimestamp="2026-04-24 21:52:33 +0000 UTC" firstStartedPulling="2026-04-24 21:52:38.209356837 +0000 UTC m=+1559.057400932" lastFinishedPulling="2026-04-24 21:52:44.819973601 +0000 UTC m=+1565.668017708" observedRunningTime="2026-04-24 21:52:45.24986789 +0000 UTC m=+1566.097912005" watchObservedRunningTime="2026-04-24 21:52:45.251397585 +0000 UTC m=+1566.099441700"
Apr 24 21:52:46.235216 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:46.235186 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:46.236230 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:46.236208 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:52:47.238413 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:47.238373 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:52:52.242555 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:52.242526 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
Apr 24 21:52:52.243099 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:52:52.243072 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:53:02.243274 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:53:02.243233 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:53:12.243876 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:53:12.243829 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:53:22.243901 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:53:22.243866 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:53:32.243231 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:53:32.243145 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:53:42.243221 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:53:42.243181 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:53:52.243194 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:53:52.243155 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"
podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 21:54:02.243477 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:02.243446 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" Apr 24 21:54:05.115461 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.115430 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"] Apr 24 21:54:05.115849 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.115731 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" containerID="cri-o://e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7" gracePeriod=30 Apr 24 21:54:05.115849 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.115779 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kube-rbac-proxy" containerID="cri-o://0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269" gracePeriod=30 Apr 24 21:54:05.229558 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.229524 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss"] Apr 24 21:54:05.230010 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.229991 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="storage-initializer" Apr 24 21:54:05.230058 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.230016 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="255efb22-913e-457a-ba69-c563dbffdf29" 
containerName="storage-initializer" Apr 24 21:54:05.230058 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.230034 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kube-rbac-proxy" Apr 24 21:54:05.230058 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.230044 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kube-rbac-proxy" Apr 24 21:54:05.230175 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.230057 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" Apr 24 21:54:05.230175 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.230067 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" Apr 24 21:54:05.230175 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.230164 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kserve-container" Apr 24 21:54:05.230271 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.230179 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="255efb22-913e-457a-ba69-c563dbffdf29" containerName="kube-rbac-proxy" Apr 24 21:54:05.234100 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.234073 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.236505 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.236478 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:54:05.236825 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.236790 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 24 21:54:05.243066 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.243019 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss"] Apr 24 21:54:05.304024 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.303991 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crbjw\" (UniqueName: \"kubernetes.io/projected/c4229560-064c-4ecc-a4c0-a089494f96e8-kube-api-access-crbjw\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.304024 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.304029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4229560-064c-4ecc-a4c0-a089494f96e8-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.304227 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.304052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c4229560-064c-4ecc-a4c0-a089494f96e8-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.304227 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.304136 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4229560-064c-4ecc-a4c0-a089494f96e8-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.404681 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.404603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crbjw\" (UniqueName: \"kubernetes.io/projected/c4229560-064c-4ecc-a4c0-a089494f96e8-kube-api-access-crbjw\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.404681 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.404645 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4229560-064c-4ecc-a4c0-a089494f96e8-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.404681 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.404671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4229560-064c-4ecc-a4c0-a089494f96e8-proxy-tls\") pod 
\"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.404967 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.404711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4229560-064c-4ecc-a4c0-a089494f96e8-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.405156 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.405134 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4229560-064c-4ecc-a4c0-a089494f96e8-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.405390 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.405359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4229560-064c-4ecc-a4c0-a089494f96e8-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.407546 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.407522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4229560-064c-4ecc-a4c0-a089494f96e8-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.412956 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.412933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crbjw\" (UniqueName: \"kubernetes.io/projected/c4229560-064c-4ecc-a4c0-a089494f96e8-kube-api-access-crbjw\") pod \"isvc-pmml-runtime-predictor-67bc544947-kz8ss\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.444619 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.444581 2571 generic.go:358] "Generic (PLEG): container finished" podID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerID="0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269" exitCode=2 Apr 24 21:54:05.444748 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.444646 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" event={"ID":"66f2dc9a-eabf-497c-b681-56f9c6acd3ac","Type":"ContainerDied","Data":"0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269"} Apr 24 21:54:05.547917 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.547879 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:05.665892 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:05.665820 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss"] Apr 24 21:54:05.669186 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:54:05.669163 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4229560_064c_4ecc_a4c0_a089494f96e8.slice/crio-0cbd27afbc61ddfe928d179fde7da198036242cbe42096cd486c2120ad52acd5 WatchSource:0}: Error finding container 0cbd27afbc61ddfe928d179fde7da198036242cbe42096cd486c2120ad52acd5: Status 404 returned error can't find the container with id 0cbd27afbc61ddfe928d179fde7da198036242cbe42096cd486c2120ad52acd5 Apr 24 21:54:06.448899 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:06.448851 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" event={"ID":"c4229560-064c-4ecc-a4c0-a089494f96e8","Type":"ContainerStarted","Data":"955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172"} Apr 24 21:54:06.448899 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:06.448899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" event={"ID":"c4229560-064c-4ecc-a4c0-a089494f96e8","Type":"ContainerStarted","Data":"0cbd27afbc61ddfe928d179fde7da198036242cbe42096cd486c2120ad52acd5"} Apr 24 21:54:07.239067 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:07.239021 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 21:54:08.646082 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.646058 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" Apr 24 21:54:08.731263 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.731183 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-proxy-tls\") pod \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " Apr 24 21:54:08.731263 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.731246 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxt2t\" (UniqueName: \"kubernetes.io/projected/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kube-api-access-dxt2t\") pod \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " Apr 24 21:54:08.731512 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.731290 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kserve-provision-location\") pod \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " Apr 24 21:54:08.731512 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.731365 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\" (UID: \"66f2dc9a-eabf-497c-b681-56f9c6acd3ac\") " Apr 24 21:54:08.731670 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.731643 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kserve-provision-location" 
(OuterVolumeSpecName: "kserve-provision-location") pod "66f2dc9a-eabf-497c-b681-56f9c6acd3ac" (UID: "66f2dc9a-eabf-497c-b681-56f9c6acd3ac"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:08.731781 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.731709 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "66f2dc9a-eabf-497c-b681-56f9c6acd3ac" (UID: "66f2dc9a-eabf-497c-b681-56f9c6acd3ac"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:54:08.733505 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.733485 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "66f2dc9a-eabf-497c-b681-56f9c6acd3ac" (UID: "66f2dc9a-eabf-497c-b681-56f9c6acd3ac"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:54:08.733596 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.733535 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kube-api-access-dxt2t" (OuterVolumeSpecName: "kube-api-access-dxt2t") pod "66f2dc9a-eabf-497c-b681-56f9c6acd3ac" (UID: "66f2dc9a-eabf-497c-b681-56f9c6acd3ac"). InnerVolumeSpecName "kube-api-access-dxt2t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:54:08.832858 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.832821 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dxt2t\" (UniqueName: \"kubernetes.io/projected/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kube-api-access-dxt2t\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:54:08.832858 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.832852 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:54:08.832858 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.832863 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:54:08.833074 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:08.832873 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66f2dc9a-eabf-497c-b681-56f9c6acd3ac-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:54:09.457705 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.457669 2571 generic.go:358] "Generic (PLEG): container finished" podID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerID="e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7" exitCode=0 Apr 24 21:54:09.457874 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.457746 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" Apr 24 21:54:09.457874 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.457752 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" event={"ID":"66f2dc9a-eabf-497c-b681-56f9c6acd3ac","Type":"ContainerDied","Data":"e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7"} Apr 24 21:54:09.457874 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.457796 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw" event={"ID":"66f2dc9a-eabf-497c-b681-56f9c6acd3ac","Type":"ContainerDied","Data":"864d6bb7ca06a5e110a68a7d8e23596362e1df09414a2caf1508a9139652f150"} Apr 24 21:54:09.457874 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.457812 2571 scope.go:117] "RemoveContainer" containerID="0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269" Apr 24 21:54:09.465801 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.465779 2571 scope.go:117] "RemoveContainer" containerID="e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7" Apr 24 21:54:09.480149 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.480129 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"] Apr 24 21:54:09.484049 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.484028 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-qq7sw"] Apr 24 21:54:09.515571 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.515551 2571 scope.go:117] "RemoveContainer" containerID="377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452" Apr 24 21:54:09.522242 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.522225 2571 scope.go:117] "RemoveContainer" containerID="0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269" Apr 24 21:54:09.522500 ip-10-0-139-5 
kubenswrapper[2571]: E0424 21:54:09.522482 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269\": container with ID starting with 0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269 not found: ID does not exist" containerID="0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269" Apr 24 21:54:09.522555 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.522510 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269"} err="failed to get container status \"0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269\": rpc error: code = NotFound desc = could not find container \"0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269\": container with ID starting with 0738722795bec531b93077596026ef6e421cbf72f35c678e220f83a0f1033269 not found: ID does not exist" Apr 24 21:54:09.522555 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.522529 2571 scope.go:117] "RemoveContainer" containerID="e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7" Apr 24 21:54:09.522761 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:54:09.522745 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7\": container with ID starting with e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7 not found: ID does not exist" containerID="e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7" Apr 24 21:54:09.522796 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.522770 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7"} err="failed to get 
container status \"e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7\": rpc error: code = NotFound desc = could not find container \"e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7\": container with ID starting with e6e0af9a905ba54a680055dab704fcde85d2308ab8fadef5894b994aa58c2fc7 not found: ID does not exist" Apr 24 21:54:09.522796 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.522786 2571 scope.go:117] "RemoveContainer" containerID="377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452" Apr 24 21:54:09.523021 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:54:09.523004 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452\": container with ID starting with 377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452 not found: ID does not exist" containerID="377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452" Apr 24 21:54:09.523064 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.523026 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452"} err="failed to get container status \"377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452\": rpc error: code = NotFound desc = could not find container \"377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452\": container with ID starting with 377e00d8dfa07cd58976db40274985384f7719b7618d8c8df7c3eb34b42ae452 not found: ID does not exist" Apr 24 21:54:09.647991 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:09.647956 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" path="/var/lib/kubelet/pods/66f2dc9a-eabf-497c-b681-56f9c6acd3ac/volumes" Apr 24 21:54:10.462983 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:10.462948 2571 generic.go:358] 
"Generic (PLEG): container finished" podID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerID="955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172" exitCode=0 Apr 24 21:54:10.463146 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:10.463026 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" event={"ID":"c4229560-064c-4ecc-a4c0-a089494f96e8","Type":"ContainerDied","Data":"955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172"} Apr 24 21:54:11.468234 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:11.468186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" event={"ID":"c4229560-064c-4ecc-a4c0-a089494f96e8","Type":"ContainerStarted","Data":"3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4"} Apr 24 21:54:11.468234 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:11.468237 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" event={"ID":"c4229560-064c-4ecc-a4c0-a089494f96e8","Type":"ContainerStarted","Data":"a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82"} Apr 24 21:54:11.468653 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:11.468457 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:11.492237 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:11.492161 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podStartSLOduration=6.492147905 podStartE2EDuration="6.492147905s" podCreationTimestamp="2026-04-24 21:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:54:11.489691755 +0000 UTC m=+1652.337735893" 
watchObservedRunningTime="2026-04-24 21:54:11.492147905 +0000 UTC m=+1652.340192019" Apr 24 21:54:12.470966 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:12.470941 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:12.472136 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:12.472110 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:54:13.474307 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:13.474264 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:54:18.478488 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:18.478459 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:54:18.479022 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:18.478995 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:54:28.479369 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:28.479326 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: 
connect: connection refused" Apr 24 21:54:38.479481 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:38.479444 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:54:48.479264 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:48.479227 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:54:58.479065 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:54:58.478984 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:55:08.479449 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:08.479410 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:55:18.479018 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:18.478977 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:55:22.644079 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:22.644044 2571 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:55:32.644440 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:32.644412 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:55:36.356338 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.356275 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss"] Apr 24 21:55:36.356714 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.356615 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" containerID="cri-o://a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82" gracePeriod=30 Apr 24 21:55:36.356714 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.356641 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kube-rbac-proxy" containerID="cri-o://3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4" gracePeriod=30 Apr 24 21:55:36.494464 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.494429 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4"] Apr 24 21:55:36.494875 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.494859 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="storage-initializer" Apr 24 21:55:36.494925 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:55:36.494879 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="storage-initializer" Apr 24 21:55:36.494925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.494892 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" Apr 24 21:55:36.494925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.494900 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" Apr 24 21:55:36.494925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.494912 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kube-rbac-proxy" Apr 24 21:55:36.494925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.494919 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kube-rbac-proxy" Apr 24 21:55:36.495082 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.495001 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kserve-container" Apr 24 21:55:36.495082 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.495012 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="66f2dc9a-eabf-497c-b681-56f9c6acd3ac" containerName="kube-rbac-proxy" Apr 24 21:55:36.498037 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.498020 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.502576 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.502553 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 21:55:36.502670 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.502585 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 24 21:55:36.512872 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.512831 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4"] Apr 24 21:55:36.621532 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.621454 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/37d74c42-5045-4242-88ce-2915433b82d4-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.621532 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.621496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.621707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.621561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/37d74c42-5045-4242-88ce-2915433b82d4-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.621707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.621623 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb6j7\" (UniqueName: \"kubernetes.io/projected/37d74c42-5045-4242-88ce-2915433b82d4-kube-api-access-wb6j7\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.702890 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.702858 2571 generic.go:358] "Generic (PLEG): container finished" podID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerID="3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4" exitCode=2 Apr 24 21:55:36.703052 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.702930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" event={"ID":"c4229560-064c-4ecc-a4c0-a089494f96e8","Type":"ContainerDied","Data":"3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4"} Apr 24 21:55:36.722742 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.722715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.722854 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.722756 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/37d74c42-5045-4242-88ce-2915433b82d4-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.722854 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.722817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb6j7\" (UniqueName: \"kubernetes.io/projected/37d74c42-5045-4242-88ce-2915433b82d4-kube-api-access-wb6j7\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.722961 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:55:36.722869 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-serving-cert: secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 24 21:55:36.722961 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.722877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/37d74c42-5045-4242-88ce-2915433b82d4-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.722961 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:55:36.722959 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls podName:37d74c42-5045-4242-88ce-2915433b82d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:55:37.222938654 +0000 UTC m=+1738.070982751 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls") pod "isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" (UID: "37d74c42-5045-4242-88ce-2915433b82d4") : secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 24 21:55:36.723316 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.723273 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/37d74c42-5045-4242-88ce-2915433b82d4-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.723416 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.723399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/37d74c42-5045-4242-88ce-2915433b82d4-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:36.732157 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:36.732133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb6j7\" (UniqueName: \"kubernetes.io/projected/37d74c42-5045-4242-88ce-2915433b82d4-kube-api-access-wb6j7\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:37.228087 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:37.228039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls\") pod 
\"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:37.230842 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:37.230812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:37.408047 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:37.408014 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:37.526989 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:37.526962 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4"] Apr 24 21:55:37.529602 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:55:37.529560 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37d74c42_5045_4242_88ce_2915433b82d4.slice/crio-4ce96a75aec91fe0d71d5e7c0feb3cbbf23bf41e242778040281fc7c86950045 WatchSource:0}: Error finding container 4ce96a75aec91fe0d71d5e7c0feb3cbbf23bf41e242778040281fc7c86950045: Status 404 returned error can't find the container with id 4ce96a75aec91fe0d71d5e7c0feb3cbbf23bf41e242778040281fc7c86950045 Apr 24 21:55:37.707283 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:37.707233 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" event={"ID":"37d74c42-5045-4242-88ce-2915433b82d4","Type":"ContainerStarted","Data":"700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b"} Apr 24 21:55:37.707283 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:55:37.707270 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" event={"ID":"37d74c42-5045-4242-88ce-2915433b82d4","Type":"ContainerStarted","Data":"4ce96a75aec91fe0d71d5e7c0feb3cbbf23bf41e242778040281fc7c86950045"} Apr 24 21:55:38.474752 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:38.474711 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 24 21:55:39.888087 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:39.888066 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:55:40.051761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.051733 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4229560-064c-4ecc-a4c0-a089494f96e8-kserve-provision-location\") pod \"c4229560-064c-4ecc-a4c0-a089494f96e8\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " Apr 24 21:55:40.051944 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.051792 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4229560-064c-4ecc-a4c0-a089494f96e8-proxy-tls\") pod \"c4229560-064c-4ecc-a4c0-a089494f96e8\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " Apr 24 21:55:40.051944 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.051838 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c4229560-064c-4ecc-a4c0-a089494f96e8-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"c4229560-064c-4ecc-a4c0-a089494f96e8\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " Apr 24 21:55:40.052052 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.051953 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crbjw\" (UniqueName: \"kubernetes.io/projected/c4229560-064c-4ecc-a4c0-a089494f96e8-kube-api-access-crbjw\") pod \"c4229560-064c-4ecc-a4c0-a089494f96e8\" (UID: \"c4229560-064c-4ecc-a4c0-a089494f96e8\") " Apr 24 21:55:40.052140 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.052117 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4229560-064c-4ecc-a4c0-a089494f96e8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c4229560-064c-4ecc-a4c0-a089494f96e8" (UID: "c4229560-064c-4ecc-a4c0-a089494f96e8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:40.052211 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.052189 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4229560-064c-4ecc-a4c0-a089494f96e8-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "c4229560-064c-4ecc-a4c0-a089494f96e8" (UID: "c4229560-064c-4ecc-a4c0-a089494f96e8"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:55:40.052255 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.052208 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4229560-064c-4ecc-a4c0-a089494f96e8-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:55:40.054038 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.054020 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4229560-064c-4ecc-a4c0-a089494f96e8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c4229560-064c-4ecc-a4c0-a089494f96e8" (UID: "c4229560-064c-4ecc-a4c0-a089494f96e8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:55:40.054038 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.054022 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4229560-064c-4ecc-a4c0-a089494f96e8-kube-api-access-crbjw" (OuterVolumeSpecName: "kube-api-access-crbjw") pod "c4229560-064c-4ecc-a4c0-a089494f96e8" (UID: "c4229560-064c-4ecc-a4c0-a089494f96e8"). InnerVolumeSpecName "kube-api-access-crbjw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:55:40.152627 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.152590 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4229560-064c-4ecc-a4c0-a089494f96e8-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:55:40.152627 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.152622 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c4229560-064c-4ecc-a4c0-a089494f96e8-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:55:40.152627 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.152635 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crbjw\" (UniqueName: \"kubernetes.io/projected/c4229560-064c-4ecc-a4c0-a089494f96e8-kube-api-access-crbjw\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:55:40.717436 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.717405 2571 generic.go:358] "Generic (PLEG): container finished" podID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerID="a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82" exitCode=0 Apr 24 21:55:40.717602 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.717470 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" Apr 24 21:55:40.717602 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.717474 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" event={"ID":"c4229560-064c-4ecc-a4c0-a089494f96e8","Type":"ContainerDied","Data":"a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82"} Apr 24 21:55:40.717602 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.717508 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss" event={"ID":"c4229560-064c-4ecc-a4c0-a089494f96e8","Type":"ContainerDied","Data":"0cbd27afbc61ddfe928d179fde7da198036242cbe42096cd486c2120ad52acd5"} Apr 24 21:55:40.717602 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.717523 2571 scope.go:117] "RemoveContainer" containerID="3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4" Apr 24 21:55:40.725596 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.725579 2571 scope.go:117] "RemoveContainer" containerID="a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82" Apr 24 21:55:40.732468 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.732451 2571 scope.go:117] "RemoveContainer" containerID="955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172" Apr 24 21:55:40.738775 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.738758 2571 scope.go:117] "RemoveContainer" containerID="3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4" Apr 24 21:55:40.739018 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:55:40.738999 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4\": container with ID starting with 3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4 not found: ID does not exist" 
containerID="3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4" Apr 24 21:55:40.739102 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.739024 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4"} err="failed to get container status \"3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4\": rpc error: code = NotFound desc = could not find container \"3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4\": container with ID starting with 3fcb19072c15d8a41f949dd785ee724e2ab17d5851809610a42973cd907983f4 not found: ID does not exist" Apr 24 21:55:40.739102 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.739043 2571 scope.go:117] "RemoveContainer" containerID="a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82" Apr 24 21:55:40.739277 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:55:40.739262 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82\": container with ID starting with a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82 not found: ID does not exist" containerID="a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82" Apr 24 21:55:40.739371 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.739287 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82"} err="failed to get container status \"a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82\": rpc error: code = NotFound desc = could not find container \"a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82\": container with ID starting with a6796dea100a0ef19e88c4f4eb27e88cbb258e11580aba837d6954193eb77d82 not found: ID does not exist" Apr 24 21:55:40.739371 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.739320 2571 scope.go:117] "RemoveContainer" containerID="955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172" Apr 24 21:55:40.739537 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:55:40.739520 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172\": container with ID starting with 955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172 not found: ID does not exist" containerID="955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172" Apr 24 21:55:40.739578 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.739542 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172"} err="failed to get container status \"955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172\": rpc error: code = NotFound desc = could not find container \"955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172\": container with ID starting with 955180dee25eb73a867689e3b79b0af5eea737541bb66392e9441c764cd7e172 not found: ID does not exist" Apr 24 21:55:40.742009 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.741990 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss"] Apr 24 21:55:40.747550 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:40.747531 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-kz8ss"] Apr 24 21:55:41.647620 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:41.647586 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" path="/var/lib/kubelet/pods/c4229560-064c-4ecc-a4c0-a089494f96e8/volumes" Apr 24 21:55:41.721503 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:55:41.721471 2571 generic.go:358] "Generic (PLEG): container finished" podID="37d74c42-5045-4242-88ce-2915433b82d4" containerID="700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b" exitCode=0 Apr 24 21:55:41.721665 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:41.721545 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" event={"ID":"37d74c42-5045-4242-88ce-2915433b82d4","Type":"ContainerDied","Data":"700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b"} Apr 24 21:55:42.727161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:42.727119 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" event={"ID":"37d74c42-5045-4242-88ce-2915433b82d4","Type":"ContainerStarted","Data":"26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753"} Apr 24 21:55:42.727161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:42.727156 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" event={"ID":"37d74c42-5045-4242-88ce-2915433b82d4","Type":"ContainerStarted","Data":"308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6"} Apr 24 21:55:42.727647 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:42.727405 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:42.727647 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:42.727434 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:42.728897 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:42.728869 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" 
podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:55:42.765249 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:42.765198 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podStartSLOduration=6.765183253 podStartE2EDuration="6.765183253s" podCreationTimestamp="2026-04-24 21:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:55:42.764745204 +0000 UTC m=+1743.612789330" watchObservedRunningTime="2026-04-24 21:55:42.765183253 +0000 UTC m=+1743.613227369" Apr 24 21:55:43.729428 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:43.729382 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:55:48.733626 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:48.733591 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:55:48.734161 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:48.734135 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:55:58.735110 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:55:58.735073 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" 
podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:56:08.734536 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:56:08.734496 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:56:18.734465 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:56:18.734419 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:56:28.735117 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:56:28.735032 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:56:38.734521 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:56:38.734486 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:56:48.734818 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:56:48.734781 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.34:8080: connect: connection refused" Apr 24 21:56:55.644562 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:56:55.644527 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:57:05.647552 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:05.647520 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:57:07.413558 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.413526 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4"] Apr 24 21:57:07.413953 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.413816 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" containerID="cri-o://308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6" gracePeriod=30 Apr 24 21:57:07.413953 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.413829 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kube-rbac-proxy" containerID="cri-o://26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753" gracePeriod=30 Apr 24 21:57:07.534414 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534385 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944"] Apr 24 21:57:07.534666 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534654 2571 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" Apr 24 21:57:07.534707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534667 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" Apr 24 21:57:07.534707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534686 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kube-rbac-proxy" Apr 24 21:57:07.534707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534691 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kube-rbac-proxy" Apr 24 21:57:07.534707 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534704 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="storage-initializer" Apr 24 21:57:07.534866 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534709 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="storage-initializer" Apr 24 21:57:07.534866 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534751 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kube-rbac-proxy" Apr 24 21:57:07.534866 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.534760 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4229560-064c-4ecc-a4c0-a089494f96e8" containerName="kserve-container" Apr 24 21:57:07.537527 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.537512 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.540002 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.539981 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-33ccfc-predictor-serving-cert\"" Apr 24 21:57:07.540456 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.540436 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-33ccfc-kube-rbac-proxy-sar-config\"" Apr 24 21:57:07.554405 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.554383 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944"] Apr 24 21:57:07.601468 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.601437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.702665 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.702568 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-isvc-primary-33ccfc-kube-rbac-proxy-sar-config\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.702665 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.702623 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.702889 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:57:07.702723 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-serving-cert: secret "isvc-primary-33ccfc-predictor-serving-cert" not found Apr 24 21:57:07.702889 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.702762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kserve-provision-location\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.702889 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:57:07.702799 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls podName:ea6d1be3-10a6-4e8b-8f29-2dc715792e41 nodeName:}" failed. No retries permitted until 2026-04-24 21:57:08.202776578 +0000 UTC m=+1829.050820684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls") pod "isvc-primary-33ccfc-predictor-7689d4bb45-vh944" (UID: "ea6d1be3-10a6-4e8b-8f29-2dc715792e41") : secret "isvc-primary-33ccfc-predictor-serving-cert" not found Apr 24 21:57:07.702889 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.702846 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvnb4\" (UniqueName: \"kubernetes.io/projected/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kube-api-access-nvnb4\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.803531 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.803493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kserve-provision-location\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.803531 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.803530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvnb4\" (UniqueName: \"kubernetes.io/projected/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kube-api-access-nvnb4\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.803788 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.803600 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-isvc-primary-33ccfc-kube-rbac-proxy-sar-config\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.803940 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.803914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kserve-provision-location\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.804188 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.804167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-isvc-primary-33ccfc-kube-rbac-proxy-sar-config\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.813643 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.813615 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvnb4\" (UniqueName: \"kubernetes.io/projected/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kube-api-access-nvnb4\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:07.954461 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.954384 2571 generic.go:358] "Generic (PLEG): container finished" podID="37d74c42-5045-4242-88ce-2915433b82d4" containerID="26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753" exitCode=2 Apr 24 
21:57:07.954461 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:07.954452 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" event={"ID":"37d74c42-5045-4242-88ce-2915433b82d4","Type":"ContainerDied","Data":"26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753"} Apr 24 21:57:08.206675 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:08.206595 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:08.209174 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:08.209143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls\") pod \"isvc-primary-33ccfc-predictor-7689d4bb45-vh944\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:08.447462 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:08.447421 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:08.566922 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:08.566893 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944"] Apr 24 21:57:08.569888 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:57:08.569858 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6d1be3_10a6_4e8b_8f29_2dc715792e41.slice/crio-2747b44ebf2aa9f063b9b794b4f4173dffed2f195cb431d84de90ff4d0ccb904 WatchSource:0}: Error finding container 2747b44ebf2aa9f063b9b794b4f4173dffed2f195cb431d84de90ff4d0ccb904: Status 404 returned error can't find the container with id 2747b44ebf2aa9f063b9b794b4f4173dffed2f195cb431d84de90ff4d0ccb904 Apr 24 21:57:08.571691 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:08.571674 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:57:08.730367 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:08.730324 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 24 21:57:08.958982 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:08.958895 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" event={"ID":"ea6d1be3-10a6-4e8b-8f29-2dc715792e41","Type":"ContainerStarted","Data":"95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02"} Apr 24 21:57:08.958982 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:08.958938 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" event={"ID":"ea6d1be3-10a6-4e8b-8f29-2dc715792e41","Type":"ContainerStarted","Data":"2747b44ebf2aa9f063b9b794b4f4173dffed2f195cb431d84de90ff4d0ccb904"} Apr 24 21:57:10.942065 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.942035 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:57:10.964912 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.964878 2571 generic.go:358] "Generic (PLEG): container finished" podID="37d74c42-5045-4242-88ce-2915433b82d4" containerID="308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6" exitCode=0 Apr 24 21:57:10.965026 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.964937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" event={"ID":"37d74c42-5045-4242-88ce-2915433b82d4","Type":"ContainerDied","Data":"308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6"} Apr 24 21:57:10.965026 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.964980 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" Apr 24 21:57:10.965026 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.964998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4" event={"ID":"37d74c42-5045-4242-88ce-2915433b82d4","Type":"ContainerDied","Data":"4ce96a75aec91fe0d71d5e7c0feb3cbbf23bf41e242778040281fc7c86950045"} Apr 24 21:57:10.965026 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.965021 2571 scope.go:117] "RemoveContainer" containerID="26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753" Apr 24 21:57:10.973096 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.973071 2571 scope.go:117] "RemoveContainer" containerID="308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6" Apr 24 21:57:10.979988 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.979970 2571 scope.go:117] "RemoveContainer" containerID="700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b" Apr 24 21:57:10.986394 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.986375 2571 scope.go:117] "RemoveContainer" containerID="26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753" Apr 24 21:57:10.986657 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:57:10.986640 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753\": container with ID starting with 26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753 not found: ID does not exist" containerID="26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753" Apr 24 21:57:10.986726 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.986665 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753"} err="failed to get container 
status \"26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753\": rpc error: code = NotFound desc = could not find container \"26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753\": container with ID starting with 26bcee3f22aae5f36156250ebd3dbd44605e52b54a08be4ac494c2f22bc77753 not found: ID does not exist" Apr 24 21:57:10.986726 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.986682 2571 scope.go:117] "RemoveContainer" containerID="308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6" Apr 24 21:57:10.986914 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:57:10.986897 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6\": container with ID starting with 308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6 not found: ID does not exist" containerID="308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6" Apr 24 21:57:10.986958 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.986922 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6"} err="failed to get container status \"308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6\": rpc error: code = NotFound desc = could not find container \"308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6\": container with ID starting with 308059ab4d355f777775f47871ce912b48fbca24bf40f26c24eeb0e889d673a6 not found: ID does not exist" Apr 24 21:57:10.986958 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.986937 2571 scope.go:117] "RemoveContainer" containerID="700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b" Apr 24 21:57:10.987159 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:57:10.987146 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b\": container with ID starting with 700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b not found: ID does not exist" containerID="700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b" Apr 24 21:57:10.987198 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:10.987161 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b"} err="failed to get container status \"700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b\": rpc error: code = NotFound desc = could not find container \"700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b\": container with ID starting with 700ede706038c884a64ee830cecd6c55c9b9c00b42bc3e2b86e07baf5600853b not found: ID does not exist" Apr 24 21:57:11.025311 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.025273 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls\") pod \"37d74c42-5045-4242-88ce-2915433b82d4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " Apr 24 21:57:11.025432 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.025318 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/37d74c42-5045-4242-88ce-2915433b82d4-kserve-provision-location\") pod \"37d74c42-5045-4242-88ce-2915433b82d4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " Apr 24 21:57:11.025432 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.025338 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/37d74c42-5045-4242-88ce-2915433b82d4-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"37d74c42-5045-4242-88ce-2915433b82d4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " Apr 24 21:57:11.025503 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.025452 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb6j7\" (UniqueName: \"kubernetes.io/projected/37d74c42-5045-4242-88ce-2915433b82d4-kube-api-access-wb6j7\") pod \"37d74c42-5045-4242-88ce-2915433b82d4\" (UID: \"37d74c42-5045-4242-88ce-2915433b82d4\") " Apr 24 21:57:11.025665 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.025634 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d74c42-5045-4242-88ce-2915433b82d4-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "37d74c42-5045-4242-88ce-2915433b82d4" (UID: "37d74c42-5045-4242-88ce-2915433b82d4"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:57:11.025665 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.025637 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37d74c42-5045-4242-88ce-2915433b82d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "37d74c42-5045-4242-88ce-2915433b82d4" (UID: "37d74c42-5045-4242-88ce-2915433b82d4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:11.025788 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.025715 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/37d74c42-5045-4242-88ce-2915433b82d4-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:57:11.025788 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.025730 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/37d74c42-5045-4242-88ce-2915433b82d4-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:57:11.027442 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.027416 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d74c42-5045-4242-88ce-2915433b82d4-kube-api-access-wb6j7" (OuterVolumeSpecName: "kube-api-access-wb6j7") pod "37d74c42-5045-4242-88ce-2915433b82d4" (UID: "37d74c42-5045-4242-88ce-2915433b82d4"). InnerVolumeSpecName "kube-api-access-wb6j7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:57:11.027542 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.027492 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "37d74c42-5045-4242-88ce-2915433b82d4" (UID: "37d74c42-5045-4242-88ce-2915433b82d4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:57:11.126188 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.126166 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37d74c42-5045-4242-88ce-2915433b82d4-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:57:11.126188 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.126187 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wb6j7\" (UniqueName: \"kubernetes.io/projected/37d74c42-5045-4242-88ce-2915433b82d4-kube-api-access-wb6j7\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:57:11.288901 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.288872 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4"] Apr 24 21:57:11.297829 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.297804 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hvjm4"] Apr 24 21:57:11.647522 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:11.647442 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d74c42-5045-4242-88ce-2915433b82d4" path="/var/lib/kubelet/pods/37d74c42-5045-4242-88ce-2915433b82d4/volumes" Apr 24 21:57:12.972263 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:12.972227 2571 generic.go:358] "Generic (PLEG): container finished" podID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerID="95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02" exitCode=0 Apr 24 21:57:12.972706 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:12.972315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" event={"ID":"ea6d1be3-10a6-4e8b-8f29-2dc715792e41","Type":"ContainerDied","Data":"95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02"} Apr 24 
21:57:13.976823 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:13.976791 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" event={"ID":"ea6d1be3-10a6-4e8b-8f29-2dc715792e41","Type":"ContainerStarted","Data":"514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698"} Apr 24 21:57:13.976823 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:13.976825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" event={"ID":"ea6d1be3-10a6-4e8b-8f29-2dc715792e41","Type":"ContainerStarted","Data":"8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0"} Apr 24 21:57:13.977237 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:13.977149 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:13.977318 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:13.977283 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:13.978383 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:13.978360 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:57:14.017172 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:14.017117 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podStartSLOduration=7.017102855 podStartE2EDuration="7.017102855s" podCreationTimestamp="2026-04-24 21:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 21:57:14.015092265 +0000 UTC m=+1834.863136405" watchObservedRunningTime="2026-04-24 21:57:14.017102855 +0000 UTC m=+1834.865146972" Apr 24 21:57:14.979919 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:14.979878 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:57:15.982764 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:15.982651 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:57:20.987225 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:20.987199 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:57:20.987667 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:20.987643 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:57:30.987630 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:30.987591 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:57:40.987727 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:40.987684 2571 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:57:50.988248 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:57:50.988211 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:58:00.988495 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:00.988411 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:58:10.987889 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:10.987844 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:58:20.988444 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:20.988415 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:58:27.673045 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673012 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt"] Apr 24 21:58:27.673424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673277 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kube-rbac-proxy" Apr 24 21:58:27.673424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673287 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kube-rbac-proxy" Apr 24 21:58:27.673424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673313 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="storage-initializer" Apr 24 21:58:27.673424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673320 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="storage-initializer" Apr 24 21:58:27.673424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673328 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" Apr 24 21:58:27.673424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673333 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" Apr 24 21:58:27.673424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673373 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kserve-container" Apr 24 21:58:27.673424 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.673385 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="37d74c42-5045-4242-88ce-2915433b82d4" containerName="kube-rbac-proxy" Apr 24 21:58:27.676399 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.676383 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.678969 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.678935 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-33ccfc-predictor-serving-cert\"" Apr 24 21:58:27.678969 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.678947 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\"" Apr 24 21:58:27.679187 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.679170 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-33ccfc-dockercfg-xjngg\"" Apr 24 21:58:27.679253 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.679173 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 21:58:27.680074 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.680058 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-33ccfc\"" Apr 24 21:58:27.687809 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.687791 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt"] Apr 24 21:58:27.802734 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.802700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-cabundle-cert\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.802914 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.802784 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.802914 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.802832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99mdx\" (UniqueName: \"kubernetes.io/projected/0ed92957-a554-4f7a-b05e-757868f87520-kube-api-access-99mdx\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.802914 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.802858 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ed92957-a554-4f7a-b05e-757868f87520-proxy-tls\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.803070 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.802949 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ed92957-a554-4f7a-b05e-757868f87520-kserve-provision-location\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.903586 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.903551 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.903763 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.903598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99mdx\" (UniqueName: \"kubernetes.io/projected/0ed92957-a554-4f7a-b05e-757868f87520-kube-api-access-99mdx\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.903763 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.903628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ed92957-a554-4f7a-b05e-757868f87520-proxy-tls\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.903763 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.903743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ed92957-a554-4f7a-b05e-757868f87520-kserve-provision-location\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.903925 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.903805 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" 
(UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-cabundle-cert\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.904178 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.904157 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ed92957-a554-4f7a-b05e-757868f87520-kserve-provision-location\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.904287 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.904227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.904434 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.904414 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-cabundle-cert\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.906149 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.906124 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ed92957-a554-4f7a-b05e-757868f87520-proxy-tls\") pod 
\"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.913199 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.913174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99mdx\" (UniqueName: \"kubernetes.io/projected/0ed92957-a554-4f7a-b05e-757868f87520-kube-api-access-99mdx\") pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:27.986674 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:27.986606 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:28.106702 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:28.106680 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt"] Apr 24 21:58:28.108702 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:58:28.108672 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed92957_a554_4f7a_b05e_757868f87520.slice/crio-a07bfd2a1d9d7df9dee71ac2ac8f936a08d80a5c83a2d3c1966b05b1ada59f82 WatchSource:0}: Error finding container a07bfd2a1d9d7df9dee71ac2ac8f936a08d80a5c83a2d3c1966b05b1ada59f82: Status 404 returned error can't find the container with id a07bfd2a1d9d7df9dee71ac2ac8f936a08d80a5c83a2d3c1966b05b1ada59f82 Apr 24 21:58:28.179428 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:28.179400 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" event={"ID":"0ed92957-a554-4f7a-b05e-757868f87520","Type":"ContainerStarted","Data":"e52527f077df2afb4780d4d959610b17ff6447fdd6c4e4707bb3461f561bf8c5"} Apr 24 
21:58:28.179530 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:28.179439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" event={"ID":"0ed92957-a554-4f7a-b05e-757868f87520","Type":"ContainerStarted","Data":"a07bfd2a1d9d7df9dee71ac2ac8f936a08d80a5c83a2d3c1966b05b1ada59f82"} Apr 24 21:58:32.192012 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:32.191983 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_0ed92957-a554-4f7a-b05e-757868f87520/storage-initializer/0.log" Apr 24 21:58:32.192402 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:32.192022 2571 generic.go:358] "Generic (PLEG): container finished" podID="0ed92957-a554-4f7a-b05e-757868f87520" containerID="e52527f077df2afb4780d4d959610b17ff6447fdd6c4e4707bb3461f561bf8c5" exitCode=1 Apr 24 21:58:32.192402 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:32.192063 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" event={"ID":"0ed92957-a554-4f7a-b05e-757868f87520","Type":"ContainerDied","Data":"e52527f077df2afb4780d4d959610b17ff6447fdd6c4e4707bb3461f561bf8c5"} Apr 24 21:58:33.196364 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:33.196337 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_0ed92957-a554-4f7a-b05e-757868f87520/storage-initializer/0.log" Apr 24 21:58:33.196745 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:33.196429 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" event={"ID":"0ed92957-a554-4f7a-b05e-757868f87520","Type":"ContainerStarted","Data":"84bacacf484798c5fb84d748fc73e8a6f8f6715270e794ccf71c50b0fe76386c"} Apr 24 21:58:38.217748 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:38.217719 2571 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_0ed92957-a554-4f7a-b05e-757868f87520/storage-initializer/1.log" Apr 24 21:58:38.218206 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:38.218108 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_0ed92957-a554-4f7a-b05e-757868f87520/storage-initializer/0.log" Apr 24 21:58:38.218206 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:38.218149 2571 generic.go:358] "Generic (PLEG): container finished" podID="0ed92957-a554-4f7a-b05e-757868f87520" containerID="84bacacf484798c5fb84d748fc73e8a6f8f6715270e794ccf71c50b0fe76386c" exitCode=1 Apr 24 21:58:38.218324 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:38.218216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" event={"ID":"0ed92957-a554-4f7a-b05e-757868f87520","Type":"ContainerDied","Data":"84bacacf484798c5fb84d748fc73e8a6f8f6715270e794ccf71c50b0fe76386c"} Apr 24 21:58:38.218324 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:38.218253 2571 scope.go:117] "RemoveContainer" containerID="e52527f077df2afb4780d4d959610b17ff6447fdd6c4e4707bb3461f561bf8c5" Apr 24 21:58:38.218657 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:38.218633 2571 scope.go:117] "RemoveContainer" containerID="e52527f077df2afb4780d4d959610b17ff6447fdd6c4e4707bb3461f561bf8c5" Apr 24 21:58:38.228468 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:38.228445 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_kserve-ci-e2e-test_0ed92957-a554-4f7a-b05e-757868f87520_0 in pod sandbox a07bfd2a1d9d7df9dee71ac2ac8f936a08d80a5c83a2d3c1966b05b1ada59f82 from index: no such id: 'e52527f077df2afb4780d4d959610b17ff6447fdd6c4e4707bb3461f561bf8c5'" 
containerID="e52527f077df2afb4780d4d959610b17ff6447fdd6c4e4707bb3461f561bf8c5" Apr 24 21:58:38.228550 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:38.228492 2571 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_kserve-ci-e2e-test_0ed92957-a554-4f7a-b05e-757868f87520_0 in pod sandbox a07bfd2a1d9d7df9dee71ac2ac8f936a08d80a5c83a2d3c1966b05b1ada59f82 from index: no such id: 'e52527f077df2afb4780d4d959610b17ff6447fdd6c4e4707bb3461f561bf8c5'; Skipping pod \"isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_kserve-ci-e2e-test(0ed92957-a554-4f7a-b05e-757868f87520)\"" logger="UnhandledError" Apr 24 21:58:38.229798 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:38.229778 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_kserve-ci-e2e-test(0ed92957-a554-4f7a-b05e-757868f87520)\"" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" podUID="0ed92957-a554-4f7a-b05e-757868f87520" Apr 24 21:58:39.222535 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:39.222507 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_0ed92957-a554-4f7a-b05e-757868f87520/storage-initializer/1.log" Apr 24 21:58:43.785979 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.785947 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944"] Apr 24 21:58:43.786548 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.786284 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" containerID="cri-o://8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0" gracePeriod=30 Apr 24 21:58:43.786548 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.786451 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kube-rbac-proxy" containerID="cri-o://514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698" gracePeriod=30 Apr 24 21:58:43.889413 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.889381 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt"] Apr 24 21:58:43.957679 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.957649 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6"] Apr 24 21:58:43.961154 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.961134 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:43.963785 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.963692 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-463e12\"" Apr 24 21:58:43.963973 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.963905 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-463e12-kube-rbac-proxy-sar-config\"" Apr 24 21:58:43.964083 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.964065 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-463e12-dockercfg-nfm44\"" Apr 24 21:58:43.964146 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.964099 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-463e12-predictor-serving-cert\"" Apr 24 21:58:43.971847 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:43.971826 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6"] Apr 24 21:58:44.015063 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.015047 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_0ed92957-a554-4f7a-b05e-757868f87520/storage-initializer/1.log" Apr 24 21:58:44.015153 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.015105 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:44.027663 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027644 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ed92957-a554-4f7a-b05e-757868f87520-proxy-tls\") pod \"0ed92957-a554-4f7a-b05e-757868f87520\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " Apr 24 21:58:44.027732 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027678 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ed92957-a554-4f7a-b05e-757868f87520-kserve-provision-location\") pod \"0ed92957-a554-4f7a-b05e-757868f87520\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " Apr 24 21:58:44.027732 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027725 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99mdx\" (UniqueName: \"kubernetes.io/projected/0ed92957-a554-4f7a-b05e-757868f87520-kube-api-access-99mdx\") pod \"0ed92957-a554-4f7a-b05e-757868f87520\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " Apr 24 21:58:44.027835 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027771 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\") pod \"0ed92957-a554-4f7a-b05e-757868f87520\" (UID: \"0ed92957-a554-4f7a-b05e-757868f87520\") " Apr 24 21:58:44.027835 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027799 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-cabundle-cert\") pod \"0ed92957-a554-4f7a-b05e-757868f87520\" (UID: 
\"0ed92957-a554-4f7a-b05e-757868f87520\") " Apr 24 21:58:44.027940 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-cabundle-cert\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.027993 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027950 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/410bd2b7-bd30-4c61-97ec-32c368502e45-kserve-provision-location\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.027993 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027949 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ed92957-a554-4f7a-b05e-757868f87520-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0ed92957-a554-4f7a-b05e-757868f87520" (UID: "0ed92957-a554-4f7a-b05e-757868f87520"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:58:44.027993 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.027980 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-463e12-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-isvc-init-fail-463e12-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.028118 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.028059 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqfzr\" (UniqueName: \"kubernetes.io/projected/410bd2b7-bd30-4c61-97ec-32c368502e45-kube-api-access-nqfzr\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.028118 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.028103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/410bd2b7-bd30-4c61-97ec-32c368502e45-proxy-tls\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.028201 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.028138 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-isvc-secondary-33ccfc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-33ccfc-kube-rbac-proxy-sar-config") pod "0ed92957-a554-4f7a-b05e-757868f87520" (UID: "0ed92957-a554-4f7a-b05e-757868f87520"). 
InnerVolumeSpecName "isvc-secondary-33ccfc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:44.028201 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.028181 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ed92957-a554-4f7a-b05e-757868f87520-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:44.028277 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.028191 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "0ed92957-a554-4f7a-b05e-757868f87520" (UID: "0ed92957-a554-4f7a-b05e-757868f87520"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:44.029863 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.029835 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed92957-a554-4f7a-b05e-757868f87520-kube-api-access-99mdx" (OuterVolumeSpecName: "kube-api-access-99mdx") pod "0ed92957-a554-4f7a-b05e-757868f87520" (UID: "0ed92957-a554-4f7a-b05e-757868f87520"). InnerVolumeSpecName "kube-api-access-99mdx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:58:44.029971 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.029924 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed92957-a554-4f7a-b05e-757868f87520-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0ed92957-a554-4f7a-b05e-757868f87520" (UID: "0ed92957-a554-4f7a-b05e-757868f87520"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:58:44.129498 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.129425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/410bd2b7-bd30-4c61-97ec-32c368502e45-proxy-tls\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.129498 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.129482 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-cabundle-cert\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.129700 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.129536 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/410bd2b7-bd30-4c61-97ec-32c368502e45-kserve-provision-location\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.129700 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.129565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-463e12-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-isvc-init-fail-463e12-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.129700 ip-10-0-139-5 
kubenswrapper[2571]: I0424 21:58:44.129611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqfzr\" (UniqueName: \"kubernetes.io/projected/410bd2b7-bd30-4c61-97ec-32c368502e45-kube-api-access-nqfzr\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.129700 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.129660 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99mdx\" (UniqueName: \"kubernetes.io/projected/0ed92957-a554-4f7a-b05e-757868f87520-kube-api-access-99mdx\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:44.129700 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.129680 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-isvc-secondary-33ccfc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:44.129700 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.129695 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0ed92957-a554-4f7a-b05e-757868f87520-cabundle-cert\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:44.130001 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.129709 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ed92957-a554-4f7a-b05e-757868f87520-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:44.130051 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.130012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/410bd2b7-bd30-4c61-97ec-32c368502e45-kserve-provision-location\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.130411 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.130390 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-cabundle-cert\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.130585 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.130561 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-463e12-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-isvc-init-fail-463e12-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.131973 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.131956 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/410bd2b7-bd30-4c61-97ec-32c368502e45-proxy-tls\") pod \"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.139909 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.139890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqfzr\" (UniqueName: \"kubernetes.io/projected/410bd2b7-bd30-4c61-97ec-32c368502e45-kube-api-access-nqfzr\") pod 
\"isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.238166 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.238132 2571 generic.go:358] "Generic (PLEG): container finished" podID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerID="514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698" exitCode=2 Apr 24 21:58:44.238354 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.238207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" event={"ID":"ea6d1be3-10a6-4e8b-8f29-2dc715792e41","Type":"ContainerDied","Data":"514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698"} Apr 24 21:58:44.239239 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.239224 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt_0ed92957-a554-4f7a-b05e-757868f87520/storage-initializer/1.log" Apr 24 21:58:44.239286 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.239278 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" event={"ID":"0ed92957-a554-4f7a-b05e-757868f87520","Type":"ContainerDied","Data":"a07bfd2a1d9d7df9dee71ac2ac8f936a08d80a5c83a2d3c1966b05b1ada59f82"} Apr 24 21:58:44.239345 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.239334 2571 scope.go:117] "RemoveContainer" containerID="84bacacf484798c5fb84d748fc73e8a6f8f6715270e794ccf71c50b0fe76386c" Apr 24 21:58:44.239383 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.239343 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt" Apr 24 21:58:44.271918 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.271800 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:44.283094 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.283072 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt"] Apr 24 21:58:44.287524 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.287504 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-33ccfc-predictor-9785c9d8b-r64mt"] Apr 24 21:58:44.400029 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:44.399996 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6"] Apr 24 21:58:44.401143 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:58:44.401114 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod410bd2b7_bd30_4c61_97ec_32c368502e45.slice/crio-4d00ab8405e40b92bfd4b664cd1840b44c33fc64a27ee418c64b94a0edfbbbb4 WatchSource:0}: Error finding container 4d00ab8405e40b92bfd4b664cd1840b44c33fc64a27ee418c64b94a0edfbbbb4: Status 404 returned error can't find the container with id 4d00ab8405e40b92bfd4b664cd1840b44c33fc64a27ee418c64b94a0edfbbbb4 Apr 24 21:58:45.246010 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:45.245973 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" event={"ID":"410bd2b7-bd30-4c61-97ec-32c368502e45","Type":"ContainerStarted","Data":"aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9"} Apr 24 21:58:45.246010 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:45.246014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" 
event={"ID":"410bd2b7-bd30-4c61-97ec-32c368502e45","Type":"ContainerStarted","Data":"4d00ab8405e40b92bfd4b664cd1840b44c33fc64a27ee418c64b94a0edfbbbb4"} Apr 24 21:58:45.647675 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:45.647597 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed92957-a554-4f7a-b05e-757868f87520" path="/var/lib/kubelet/pods/0ed92957-a554-4f7a-b05e-757868f87520/volumes" Apr 24 21:58:45.983797 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:45.983706 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.35:8643/healthz\": dial tcp 10.134.0.35:8643: connect: connection refused" Apr 24 21:58:48.121237 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.121217 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:58:48.161483 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.161461 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvnb4\" (UniqueName: \"kubernetes.io/projected/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kube-api-access-nvnb4\") pod \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " Apr 24 21:58:48.161604 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.161495 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kserve-provision-location\") pod \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " Apr 24 21:58:48.161604 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.161574 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls\") pod \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " Apr 24 21:58:48.161604 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.161593 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-isvc-primary-33ccfc-kube-rbac-proxy-sar-config\") pod \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\" (UID: \"ea6d1be3-10a6-4e8b-8f29-2dc715792e41\") " Apr 24 21:58:48.161947 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.161911 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ea6d1be3-10a6-4e8b-8f29-2dc715792e41" (UID: "ea6d1be3-10a6-4e8b-8f29-2dc715792e41"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:58:48.162033 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.161972 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-isvc-primary-33ccfc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-33ccfc-kube-rbac-proxy-sar-config") pod "ea6d1be3-10a6-4e8b-8f29-2dc715792e41" (UID: "ea6d1be3-10a6-4e8b-8f29-2dc715792e41"). InnerVolumeSpecName "isvc-primary-33ccfc-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:48.163639 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.163622 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kube-api-access-nvnb4" (OuterVolumeSpecName: "kube-api-access-nvnb4") pod "ea6d1be3-10a6-4e8b-8f29-2dc715792e41" (UID: "ea6d1be3-10a6-4e8b-8f29-2dc715792e41"). InnerVolumeSpecName "kube-api-access-nvnb4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:58:48.163697 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.163688 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ea6d1be3-10a6-4e8b-8f29-2dc715792e41" (UID: "ea6d1be3-10a6-4e8b-8f29-2dc715792e41"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:58:48.256695 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.256663 2571 generic.go:358] "Generic (PLEG): container finished" podID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerID="8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0" exitCode=0 Apr 24 21:58:48.256871 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.256707 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" event={"ID":"ea6d1be3-10a6-4e8b-8f29-2dc715792e41","Type":"ContainerDied","Data":"8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0"} Apr 24 21:58:48.256871 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.256733 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" Apr 24 21:58:48.256871 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.256746 2571 scope.go:117] "RemoveContainer" containerID="514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698" Apr 24 21:58:48.256871 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.256733 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944" event={"ID":"ea6d1be3-10a6-4e8b-8f29-2dc715792e41","Type":"ContainerDied","Data":"2747b44ebf2aa9f063b9b794b4f4173dffed2f195cb431d84de90ff4d0ccb904"} Apr 24 21:58:48.262317 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.262276 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvnb4\" (UniqueName: \"kubernetes.io/projected/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kube-api-access-nvnb4\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:48.262317 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.262316 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:48.262447 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.262331 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:48.262447 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.262346 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-33ccfc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea6d1be3-10a6-4e8b-8f29-2dc715792e41-isvc-primary-33ccfc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:48.264394 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.264372 2571 scope.go:117] "RemoveContainer" containerID="8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0" Apr 24 21:58:48.271196 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.271181 2571 scope.go:117] "RemoveContainer" containerID="95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02" Apr 24 21:58:48.278105 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.278085 2571 scope.go:117] "RemoveContainer" containerID="514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698" Apr 24 21:58:48.278403 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:48.278355 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698\": container with ID starting with 514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698 not found: ID does not exist" containerID="514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698" Apr 24 21:58:48.278462 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.278412 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698"} err="failed to get container status \"514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698\": rpc error: code = NotFound desc = could not find container \"514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698\": container with ID starting with 514f7ca0e0ec2dd9687295df0f7a3341749ed2bea152ba2c957adaea87059698 not found: ID does not exist" Apr 24 21:58:48.278462 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.278428 2571 scope.go:117] "RemoveContainer" containerID="8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0" Apr 24 21:58:48.278691 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:48.278663 2571 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0\": container with ID starting with 8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0 not found: ID does not exist" containerID="8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0" Apr 24 21:58:48.278742 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.278704 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0"} err="failed to get container status \"8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0\": rpc error: code = NotFound desc = could not find container \"8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0\": container with ID starting with 8d5b5f42b9156d732f3125809082d6e454cd88a08053c412d18c512f73bd5be0 not found: ID does not exist" Apr 24 21:58:48.278742 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.278727 2571 scope.go:117] "RemoveContainer" containerID="95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02" Apr 24 21:58:48.278966 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:48.278950 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02\": container with ID starting with 95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02 not found: ID does not exist" containerID="95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02" Apr 24 21:58:48.279010 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.278971 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02"} err="failed to get container status \"95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02\": rpc error: code = 
NotFound desc = could not find container \"95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02\": container with ID starting with 95ba82d75dfdd0ef88b468b57788b0af2c9c128114a2fe61155b5290cc059e02 not found: ID does not exist" Apr 24 21:58:48.279055 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.279039 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944"] Apr 24 21:58:48.284946 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:48.284926 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-33ccfc-predictor-7689d4bb45-vh944"] Apr 24 21:58:49.652311 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:49.652259 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" path="/var/lib/kubelet/pods/ea6d1be3-10a6-4e8b-8f29-2dc715792e41/volumes" Apr 24 21:58:50.264735 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:50.264710 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_410bd2b7-bd30-4c61-97ec-32c368502e45/storage-initializer/0.log" Apr 24 21:58:50.264920 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:50.264748 2571 generic.go:358] "Generic (PLEG): container finished" podID="410bd2b7-bd30-4c61-97ec-32c368502e45" containerID="aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9" exitCode=1 Apr 24 21:58:50.264920 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:50.264794 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" event={"ID":"410bd2b7-bd30-4c61-97ec-32c368502e45","Type":"ContainerDied","Data":"aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9"} Apr 24 21:58:51.269359 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:51.269332 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_410bd2b7-bd30-4c61-97ec-32c368502e45/storage-initializer/0.log" Apr 24 21:58:51.269740 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:51.269405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" event={"ID":"410bd2b7-bd30-4c61-97ec-32c368502e45","Type":"ContainerStarted","Data":"971a99aa2f2c55a9d290f12dfe75354ef44db66afec419cfccfcccfd2461a5af"} Apr 24 21:58:53.275839 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:53.275811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_410bd2b7-bd30-4c61-97ec-32c368502e45/storage-initializer/1.log" Apr 24 21:58:53.276205 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:53.276180 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_410bd2b7-bd30-4c61-97ec-32c368502e45/storage-initializer/0.log" Apr 24 21:58:53.276251 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:53.276211 2571 generic.go:358] "Generic (PLEG): container finished" podID="410bd2b7-bd30-4c61-97ec-32c368502e45" containerID="971a99aa2f2c55a9d290f12dfe75354ef44db66afec419cfccfcccfd2461a5af" exitCode=1 Apr 24 21:58:53.276251 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:53.276242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" event={"ID":"410bd2b7-bd30-4c61-97ec-32c368502e45","Type":"ContainerDied","Data":"971a99aa2f2c55a9d290f12dfe75354ef44db66afec419cfccfcccfd2461a5af"} Apr 24 21:58:53.276370 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:53.276267 2571 scope.go:117] "RemoveContainer" containerID="aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9" Apr 24 21:58:53.276655 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:53.276640 2571 scope.go:117] 
"RemoveContainer" containerID="aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9" Apr 24 21:58:53.286837 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:53.286800 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_kserve-ci-e2e-test_410bd2b7-bd30-4c61-97ec-32c368502e45_0 in pod sandbox 4d00ab8405e40b92bfd4b664cd1840b44c33fc64a27ee418c64b94a0edfbbbb4 from index: no such id: 'aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9'" containerID="aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9" Apr 24 21:58:53.286931 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:53.286846 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_kserve-ci-e2e-test_410bd2b7-bd30-4c61-97ec-32c368502e45_0 in pod sandbox 4d00ab8405e40b92bfd4b664cd1840b44c33fc64a27ee418c64b94a0edfbbbb4 from index: no such id: 'aa077517ae58322498dfb193e9b7985901d747c0e578b50c1dcfffad1f14a6a9'" Apr 24 21:58:53.287036 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:53.287017 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_kserve-ci-e2e-test(410bd2b7-bd30-4c61-97ec-32c368502e45)\"" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" podUID="410bd2b7-bd30-4c61-97ec-32c368502e45" Apr 24 21:58:53.923413 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:53.923374 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6"] Apr 24 21:58:54.072307 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072270 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z"] Apr 24 21:58:54.072570 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072558 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="storage-initializer" Apr 24 21:58:54.072611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072572 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="storage-initializer" Apr 24 21:58:54.072611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072586 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kube-rbac-proxy" Apr 24 21:58:54.072611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072591 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kube-rbac-proxy" Apr 24 21:58:54.072611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072598 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ed92957-a554-4f7a-b05e-757868f87520" containerName="storage-initializer" Apr 24 21:58:54.072611 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072604 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed92957-a554-4f7a-b05e-757868f87520" containerName="storage-initializer" Apr 24 21:58:54.072761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072616 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" Apr 24 21:58:54.072761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072621 2571 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" Apr 24 21:58:54.072761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072628 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ed92957-a554-4f7a-b05e-757868f87520" containerName="storage-initializer" Apr 24 21:58:54.072761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072633 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed92957-a554-4f7a-b05e-757868f87520" containerName="storage-initializer" Apr 24 21:58:54.072761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072672 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ed92957-a554-4f7a-b05e-757868f87520" containerName="storage-initializer" Apr 24 21:58:54.072761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072683 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ed92957-a554-4f7a-b05e-757868f87520" containerName="storage-initializer" Apr 24 21:58:54.072761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072690 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kube-rbac-proxy" Apr 24 21:58:54.072761 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.072697 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea6d1be3-10a6-4e8b-8f29-2dc715792e41" containerName="kserve-container" Apr 24 21:58:54.077017 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.076995 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.079865 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.079841 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 24 21:58:54.079989 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.079866 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 24 21:58:54.079989 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.079874 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-j8kq7\"" Apr 24 21:58:54.085695 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.085674 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z"] Apr 24 21:58:54.107535 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.107499 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.107658 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.107549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d0950ca1-5a7f-4ec0-8876-74ebf661a694-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.107658 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.107599 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.107658 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.107630 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55ndf\" (UniqueName: \"kubernetes.io/projected/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kube-api-access-55ndf\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.208330 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.208235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55ndf\" (UniqueName: \"kubernetes.io/projected/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kube-api-access-55ndf\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.208330 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.208325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 
21:58:54.208536 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.208360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d0950ca1-5a7f-4ec0-8876-74ebf661a694-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.208536 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.208388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.208536 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:54.208483 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-serving-cert: secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 24 21:58:54.208664 ip-10-0-139-5 kubenswrapper[2571]: E0424 21:58:54.208543 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls podName:d0950ca1-5a7f-4ec0-8876-74ebf661a694 nodeName:}" failed. No retries permitted until 2026-04-24 21:58:54.708524153 +0000 UTC m=+1935.556568246 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls") pod "isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" (UID: "d0950ca1-5a7f-4ec0-8876-74ebf661a694") : secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 24 21:58:54.208793 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.208762 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.209043 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.209023 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d0950ca1-5a7f-4ec0-8876-74ebf661a694-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.219457 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.219431 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55ndf\" (UniqueName: \"kubernetes.io/projected/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kube-api-access-55ndf\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.280047 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.280019 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_410bd2b7-bd30-4c61-97ec-32c368502e45/storage-initializer/1.log" Apr 24 21:58:54.397927 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.397890 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_410bd2b7-bd30-4c61-97ec-32c368502e45/storage-initializer/1.log" Apr 24 21:58:54.398034 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.397971 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:54.510931 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.510902 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqfzr\" (UniqueName: \"kubernetes.io/projected/410bd2b7-bd30-4c61-97ec-32c368502e45-kube-api-access-nqfzr\") pod \"410bd2b7-bd30-4c61-97ec-32c368502e45\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " Apr 24 21:58:54.511086 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.510940 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/410bd2b7-bd30-4c61-97ec-32c368502e45-proxy-tls\") pod \"410bd2b7-bd30-4c61-97ec-32c368502e45\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " Apr 24 21:58:54.511086 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.510985 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-cabundle-cert\") pod \"410bd2b7-bd30-4c61-97ec-32c368502e45\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " Apr 24 21:58:54.511086 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.511021 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/410bd2b7-bd30-4c61-97ec-32c368502e45-kserve-provision-location\") pod \"410bd2b7-bd30-4c61-97ec-32c368502e45\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " Apr 24 21:58:54.511086 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.511045 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-463e12-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-isvc-init-fail-463e12-kube-rbac-proxy-sar-config\") pod \"410bd2b7-bd30-4c61-97ec-32c368502e45\" (UID: \"410bd2b7-bd30-4c61-97ec-32c368502e45\") " Apr 24 21:58:54.511387 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.511356 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/410bd2b7-bd30-4c61-97ec-32c368502e45-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "410bd2b7-bd30-4c61-97ec-32c368502e45" (UID: "410bd2b7-bd30-4c61-97ec-32c368502e45"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:58:54.511446 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.511408 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "410bd2b7-bd30-4c61-97ec-32c368502e45" (UID: "410bd2b7-bd30-4c61-97ec-32c368502e45"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:54.511494 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.511456 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-isvc-init-fail-463e12-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-463e12-kube-rbac-proxy-sar-config") pod "410bd2b7-bd30-4c61-97ec-32c368502e45" (UID: "410bd2b7-bd30-4c61-97ec-32c368502e45"). InnerVolumeSpecName "isvc-init-fail-463e12-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:54.513259 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.513236 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410bd2b7-bd30-4c61-97ec-32c368502e45-kube-api-access-nqfzr" (OuterVolumeSpecName: "kube-api-access-nqfzr") pod "410bd2b7-bd30-4c61-97ec-32c368502e45" (UID: "410bd2b7-bd30-4c61-97ec-32c368502e45"). InnerVolumeSpecName "kube-api-access-nqfzr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:58:54.513319 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.513238 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410bd2b7-bd30-4c61-97ec-32c368502e45-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "410bd2b7-bd30-4c61-97ec-32c368502e45" (UID: "410bd2b7-bd30-4c61-97ec-32c368502e45"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:58:54.611939 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.611892 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nqfzr\" (UniqueName: \"kubernetes.io/projected/410bd2b7-bd30-4c61-97ec-32c368502e45-kube-api-access-nqfzr\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:54.611939 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.611933 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/410bd2b7-bd30-4c61-97ec-32c368502e45-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:54.611939 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.611943 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-cabundle-cert\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:54.611939 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.611952 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/410bd2b7-bd30-4c61-97ec-32c368502e45-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:54.612197 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.611961 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-463e12-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/410bd2b7-bd30-4c61-97ec-32c368502e45-isvc-init-fail-463e12-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 21:58:54.712765 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.712725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls\") pod 
\"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.715345 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.715326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:54.988585 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:54.988502 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:58:55.119928 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.119842 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z"] Apr 24 21:58:55.122618 ip-10-0-139-5 kubenswrapper[2571]: W0424 21:58:55.122589 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0950ca1_5a7f_4ec0_8876_74ebf661a694.slice/crio-b1a1eebe114649440b3b77dd3528737dc67efb05dfccf8a0661453fa6073cef3 WatchSource:0}: Error finding container b1a1eebe114649440b3b77dd3528737dc67efb05dfccf8a0661453fa6073cef3: Status 404 returned error can't find the container with id b1a1eebe114649440b3b77dd3528737dc67efb05dfccf8a0661453fa6073cef3 Apr 24 21:58:55.284573 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.284534 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" event={"ID":"d0950ca1-5a7f-4ec0-8876-74ebf661a694","Type":"ContainerStarted","Data":"8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a"} Apr 24 21:58:55.284573 
ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.284575 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" event={"ID":"d0950ca1-5a7f-4ec0-8876-74ebf661a694","Type":"ContainerStarted","Data":"b1a1eebe114649440b3b77dd3528737dc67efb05dfccf8a0661453fa6073cef3"} Apr 24 21:58:55.285754 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.285736 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6_410bd2b7-bd30-4c61-97ec-32c368502e45/storage-initializer/1.log" Apr 24 21:58:55.285846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.285790 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" event={"ID":"410bd2b7-bd30-4c61-97ec-32c368502e45","Type":"ContainerDied","Data":"4d00ab8405e40b92bfd4b664cd1840b44c33fc64a27ee418c64b94a0edfbbbb4"} Apr 24 21:58:55.285846 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.285827 2571 scope.go:117] "RemoveContainer" containerID="971a99aa2f2c55a9d290f12dfe75354ef44db66afec419cfccfcccfd2461a5af" Apr 24 21:58:55.285967 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.285865 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6" Apr 24 21:58:55.340360 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.340329 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6"] Apr 24 21:58:55.345976 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.345950 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-463e12-predictor-58bcf495d8-qzpv6"] Apr 24 21:58:55.647618 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:55.647525 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410bd2b7-bd30-4c61-97ec-32c368502e45" path="/var/lib/kubelet/pods/410bd2b7-bd30-4c61-97ec-32c368502e45/volumes" Apr 24 21:58:59.298104 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:59.298072 2571 generic.go:358] "Generic (PLEG): container finished" podID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerID="8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a" exitCode=0 Apr 24 21:58:59.298554 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:58:59.298151 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" event={"ID":"d0950ca1-5a7f-4ec0-8876-74ebf661a694","Type":"ContainerDied","Data":"8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a"} Apr 24 21:59:17.358559 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:17.358526 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" event={"ID":"d0950ca1-5a7f-4ec0-8876-74ebf661a694","Type":"ContainerStarted","Data":"485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5"} Apr 24 21:59:17.358559 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:17.358565 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" 
event={"ID":"d0950ca1-5a7f-4ec0-8876-74ebf661a694","Type":"ContainerStarted","Data":"d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf"} Apr 24 21:59:17.358973 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:17.358765 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:59:17.385054 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:17.385010 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podStartSLOduration=6.260930164 podStartE2EDuration="23.384996614s" podCreationTimestamp="2026-04-24 21:58:54 +0000 UTC" firstStartedPulling="2026-04-24 21:58:59.299250337 +0000 UTC m=+1940.147294430" lastFinishedPulling="2026-04-24 21:59:16.423316786 +0000 UTC m=+1957.271360880" observedRunningTime="2026-04-24 21:59:17.383183033 +0000 UTC m=+1958.231227150" watchObservedRunningTime="2026-04-24 21:59:17.384996614 +0000 UTC m=+1958.233040729" Apr 24 21:59:18.361350 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:18.361315 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:59:18.362708 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:18.362679 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:59:19.363767 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:19.363728 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.38:8080: connect: connection refused" Apr 24 21:59:24.367879 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:24.367849 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 21:59:24.368448 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:24.368422 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:59:34.368814 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:34.368762 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:59:44.368688 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:44.368649 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:59:54.368511 ip-10-0-139-5 kubenswrapper[2571]: I0424 21:59:54.368470 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 22:00:04.368740 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:04.368697 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" 
podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 22:00:14.369083 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:14.369044 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 22:00:24.368998 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:24.368956 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 22:00:30.644496 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:30.644466 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 22:00:34.261223 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.261187 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z"] Apr 24 22:00:34.261637 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.261471 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" containerID="cri-o://d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf" gracePeriod=30 Apr 24 22:00:34.261637 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.261531 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" 
podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kube-rbac-proxy" containerID="cri-o://485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5" gracePeriod=30 Apr 24 22:00:34.356672 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.356642 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6"] Apr 24 22:00:34.357010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.356983 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="410bd2b7-bd30-4c61-97ec-32c368502e45" containerName="storage-initializer" Apr 24 22:00:34.357010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.357002 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="410bd2b7-bd30-4c61-97ec-32c368502e45" containerName="storage-initializer" Apr 24 22:00:34.357174 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.357071 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="410bd2b7-bd30-4c61-97ec-32c368502e45" containerName="storage-initializer" Apr 24 22:00:34.357174 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.357082 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="410bd2b7-bd30-4c61-97ec-32c368502e45" containerName="storage-initializer" Apr 24 22:00:34.357174 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.357150 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="410bd2b7-bd30-4c61-97ec-32c368502e45" containerName="storage-initializer" Apr 24 22:00:34.357174 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.357159 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="410bd2b7-bd30-4c61-97ec-32c368502e45" containerName="storage-initializer" Apr 24 22:00:34.360287 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.360272 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.362835 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.362813 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 24 22:00:34.363038 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.363024 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 24 22:00:34.364111 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.364082 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused" Apr 24 22:00:34.372544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.372526 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6"] Apr 24 22:00:34.510948 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.510916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3a24a9-55b1-4279-aa96-2f627a013eb2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.510948 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.510954 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.511177 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.510979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw6gh\" (UniqueName: \"kubernetes.io/projected/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kube-api-access-lw6gh\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.511177 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.511038 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3a24a9-55b1-4279-aa96-2f627a013eb2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.583839 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.583762 2571 generic.go:358] "Generic (PLEG): container finished" podID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerID="485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5" exitCode=2 Apr 24 22:00:34.583839 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.583832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" event={"ID":"d0950ca1-5a7f-4ec0-8876-74ebf661a694","Type":"ContainerDied","Data":"485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5"} Apr 24 22:00:34.611593 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.611562 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3a24a9-55b1-4279-aa96-2f627a013eb2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.611718 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.611635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3a24a9-55b1-4279-aa96-2f627a013eb2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.611718 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.611670 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.611718 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.611698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw6gh\" (UniqueName: \"kubernetes.io/projected/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kube-api-access-lw6gh\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.612076 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.612052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.612349 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.612327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3a24a9-55b1-4279-aa96-2f627a013eb2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.614069 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.614051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3a24a9-55b1-4279-aa96-2f627a013eb2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.621094 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.621069 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw6gh\" (UniqueName: \"kubernetes.io/projected/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kube-api-access-lw6gh\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.670855 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.670823 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:34.794175 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:34.794142 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6"] Apr 24 22:00:34.797838 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:00:34.797807 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc3a24a9_55b1_4279_aa96_2f627a013eb2.slice/crio-8b8aa1b4afe78cf4626bbce67cc3b108153689e5911ef00c925f0ae53172ea7b WatchSource:0}: Error finding container 8b8aa1b4afe78cf4626bbce67cc3b108153689e5911ef00c925f0ae53172ea7b: Status 404 returned error can't find the container with id 8b8aa1b4afe78cf4626bbce67cc3b108153689e5911ef00c925f0ae53172ea7b Apr 24 22:00:35.588190 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:35.588144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" event={"ID":"fc3a24a9-55b1-4279-aa96-2f627a013eb2","Type":"ContainerStarted","Data":"d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff"} Apr 24 22:00:35.588190 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:35.588191 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" event={"ID":"fc3a24a9-55b1-4279-aa96-2f627a013eb2","Type":"ContainerStarted","Data":"8b8aa1b4afe78cf4626bbce67cc3b108153689e5911ef00c925f0ae53172ea7b"} Apr 24 22:00:38.598666 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.598579 2571 generic.go:358] "Generic (PLEG): container finished" podID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerID="d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff" exitCode=0 Apr 24 22:00:38.598666 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.598636 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" event={"ID":"fc3a24a9-55b1-4279-aa96-2f627a013eb2","Type":"ContainerDied","Data":"d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff"} Apr 24 22:00:38.893873 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.893852 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 22:00:38.946756 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.946726 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d0950ca1-5a7f-4ec0-8876-74ebf661a694-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " Apr 24 22:00:38.946913 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.946788 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kserve-provision-location\") pod \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " Apr 24 22:00:38.946913 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.946838 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls\") pod \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\" (UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " Apr 24 22:00:38.946913 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.946877 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55ndf\" (UniqueName: \"kubernetes.io/projected/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kube-api-access-55ndf\") pod \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\" 
(UID: \"d0950ca1-5a7f-4ec0-8876-74ebf661a694\") " Apr 24 22:00:38.947104 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.947076 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d0950ca1-5a7f-4ec0-8876-74ebf661a694" (UID: "d0950ca1-5a7f-4ec0-8876-74ebf661a694"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:00:38.947147 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.947115 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0950ca1-5a7f-4ec0-8876-74ebf661a694-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "d0950ca1-5a7f-4ec0-8876-74ebf661a694" (UID: "d0950ca1-5a7f-4ec0-8876-74ebf661a694"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:00:38.948931 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.948911 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d0950ca1-5a7f-4ec0-8876-74ebf661a694" (UID: "d0950ca1-5a7f-4ec0-8876-74ebf661a694"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:00:38.949019 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:38.949005 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kube-api-access-55ndf" (OuterVolumeSpecName: "kube-api-access-55ndf") pod "d0950ca1-5a7f-4ec0-8876-74ebf661a694" (UID: "d0950ca1-5a7f-4ec0-8876-74ebf661a694"). InnerVolumeSpecName "kube-api-access-55ndf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:00:39.047392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.047355 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:00:39.047392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.047385 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0950ca1-5a7f-4ec0-8876-74ebf661a694-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:00:39.047392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.047394 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-55ndf\" (UniqueName: \"kubernetes.io/projected/d0950ca1-5a7f-4ec0-8876-74ebf661a694-kube-api-access-55ndf\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:00:39.047611 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.047404 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d0950ca1-5a7f-4ec0-8876-74ebf661a694-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:00:39.603340 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.603229 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" event={"ID":"fc3a24a9-55b1-4279-aa96-2f627a013eb2","Type":"ContainerStarted","Data":"764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf"} Apr 24 22:00:39.603340 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.603272 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" 
event={"ID":"fc3a24a9-55b1-4279-aa96-2f627a013eb2","Type":"ContainerStarted","Data":"c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a"} Apr 24 22:00:39.603838 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.603592 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:39.603838 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.603706 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:39.605045 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.605021 2571 generic.go:358] "Generic (PLEG): container finished" podID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerID="d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf" exitCode=0 Apr 24 22:00:39.605045 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.605036 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:00:39.605239 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.605082 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" event={"ID":"d0950ca1-5a7f-4ec0-8876-74ebf661a694","Type":"ContainerDied","Data":"d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf"} Apr 24 22:00:39.605239 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.605108 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" event={"ID":"d0950ca1-5a7f-4ec0-8876-74ebf661a694","Type":"ContainerDied","Data":"b1a1eebe114649440b3b77dd3528737dc67efb05dfccf8a0661453fa6073cef3"} Apr 24 
22:00:39.605239 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.605111 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z" Apr 24 22:00:39.605239 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.605125 2571 scope.go:117] "RemoveContainer" containerID="485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5" Apr 24 22:00:39.612957 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.612769 2571 scope.go:117] "RemoveContainer" containerID="d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf" Apr 24 22:00:39.619675 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.619657 2571 scope.go:117] "RemoveContainer" containerID="8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a" Apr 24 22:00:39.626503 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.626456 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podStartSLOduration=5.626446305 podStartE2EDuration="5.626446305s" podCreationTimestamp="2026-04-24 22:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:00:39.624810638 +0000 UTC m=+2040.472854751" watchObservedRunningTime="2026-04-24 22:00:39.626446305 +0000 UTC m=+2040.474490422" Apr 24 22:00:39.626896 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.626871 2571 scope.go:117] "RemoveContainer" containerID="485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5" Apr 24 22:00:39.627154 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:00:39.627130 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5\": container with ID starting with 
485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5 not found: ID does not exist" containerID="485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5" Apr 24 22:00:39.627269 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.627164 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5"} err="failed to get container status \"485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5\": rpc error: code = NotFound desc = could not find container \"485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5\": container with ID starting with 485ba56a46712767b279dad88f5d3446c1846a35f78d5b106e6ca8ffad4708e5 not found: ID does not exist" Apr 24 22:00:39.627269 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.627182 2571 scope.go:117] "RemoveContainer" containerID="d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf" Apr 24 22:00:39.627480 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:00:39.627465 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf\": container with ID starting with d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf not found: ID does not exist" containerID="d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf" Apr 24 22:00:39.627537 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.627485 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf"} err="failed to get container status \"d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf\": rpc error: code = NotFound desc = could not find container \"d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf\": container with ID starting with 
d2b96e286b910843aa79e2e0cdaf056601df1fc1410482c91dab8149e0ac3fdf not found: ID does not exist" Apr 24 22:00:39.627537 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.627498 2571 scope.go:117] "RemoveContainer" containerID="8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a" Apr 24 22:00:39.627763 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:00:39.627746 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a\": container with ID starting with 8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a not found: ID does not exist" containerID="8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a" Apr 24 22:00:39.627811 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.627768 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a"} err="failed to get container status \"8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a\": rpc error: code = NotFound desc = could not find container \"8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a\": container with ID starting with 8ce6eec1b00b8da62fe0215a0bffd84ce5bdb62d1e83353c97892f3c1b958c2a not found: ID does not exist" Apr 24 22:00:39.638438 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.638419 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z"] Apr 24 22:00:39.647729 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:39.647708 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9l4z"] Apr 24 22:00:40.609137 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:40.609092 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:00:41.647142 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:41.647105 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" path="/var/lib/kubelet/pods/d0950ca1-5a7f-4ec0-8876-74ebf661a694/volumes" Apr 24 22:00:45.614059 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:45.614027 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:00:45.614684 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:45.614653 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:00:55.614939 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:00:55.614899 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:01:05.614598 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:01:05.614514 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:01:15.615389 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:01:15.615345 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:01:25.615079 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:01:25.615038 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:01:35.615141 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:01:35.615101 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:01:45.615334 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:01:45.615271 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:01:55.615460 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:01:55.615421 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:02:04.440030 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.439994 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6"] Apr 24 22:02:04.440612 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.440426 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" containerID="cri-o://c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a" gracePeriod=30 Apr 24 22:02:04.440761 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.440490 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kube-rbac-proxy" containerID="cri-o://764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf" gracePeriod=30 Apr 24 22:02:04.562815 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.562784 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl"] Apr 24 22:02:04.563078 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.563067 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="storage-initializer" Apr 24 22:02:04.563126 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.563080 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="storage-initializer" Apr 24 22:02:04.563126 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.563094 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" Apr 24 22:02:04.563126 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.563100 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" Apr 24 22:02:04.563126 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.563110 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" 
containerName="kube-rbac-proxy" Apr 24 22:02:04.563126 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.563115 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kube-rbac-proxy" Apr 24 22:02:04.563308 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.563161 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kserve-container" Apr 24 22:02:04.563308 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.563171 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0950ca1-5a7f-4ec0-8876-74ebf661a694" containerName="kube-rbac-proxy" Apr 24 22:02:04.566010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.565985 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.569342 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.569318 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 24 22:02:04.569450 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.569322 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 24 22:02:04.576974 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.576946 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl"] Apr 24 22:02:04.590662 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.590629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfbng\" (UniqueName: \"kubernetes.io/projected/ce611f4e-61d2-4f28-a350-e7c41563bc82-kube-api-access-bfbng\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: 
\"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.590847 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.590745 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce611f4e-61d2-4f28-a350-e7c41563bc82-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.590847 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.590792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce611f4e-61d2-4f28-a350-e7c41563bc82-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.590926 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.590860 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce611f4e-61d2-4f28-a350-e7c41563bc82-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.692054 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.691954 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfbng\" (UniqueName: \"kubernetes.io/projected/ce611f4e-61d2-4f28-a350-e7c41563bc82-kube-api-access-bfbng\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: 
\"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.692054 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.692024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce611f4e-61d2-4f28-a350-e7c41563bc82-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.692269 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.692056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce611f4e-61d2-4f28-a350-e7c41563bc82-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.692269 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.692089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce611f4e-61d2-4f28-a350-e7c41563bc82-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.692542 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.692519 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce611f4e-61d2-4f28-a350-e7c41563bc82-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.692798 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.692777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce611f4e-61d2-4f28-a350-e7c41563bc82-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.694821 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.694796 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce611f4e-61d2-4f28-a350-e7c41563bc82-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.700781 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.700749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfbng\" (UniqueName: \"kubernetes.io/projected/ce611f4e-61d2-4f28-a350-e7c41563bc82-kube-api-access-bfbng\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:04.837921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.837887 2571 generic.go:358] "Generic (PLEG): container finished" podID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerID="764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf" exitCode=2 Apr 24 22:02:04.838091 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.837942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" 
event={"ID":"fc3a24a9-55b1-4279-aa96-2f627a013eb2","Type":"ContainerDied","Data":"764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf"} Apr 24 22:02:04.876261 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:04.876231 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:05.003713 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:05.003530 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl"] Apr 24 22:02:05.006324 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:02:05.006274 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce611f4e_61d2_4f28_a350_e7c41563bc82.slice/crio-3270aec8576a82a0ef97a145dd205a484163421a835642079ccc918c207382b3 WatchSource:0}: Error finding container 3270aec8576a82a0ef97a145dd205a484163421a835642079ccc918c207382b3: Status 404 returned error can't find the container with id 3270aec8576a82a0ef97a145dd205a484163421a835642079ccc918c207382b3 Apr 24 22:02:05.609682 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:05.609638 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 24 22:02:05.614607 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:05.614578 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 22:02:05.842157 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:02:05.842124 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" event={"ID":"ce611f4e-61d2-4f28-a350-e7c41563bc82","Type":"ContainerStarted","Data":"744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17"} Apr 24 22:02:05.842157 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:05.842162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" event={"ID":"ce611f4e-61d2-4f28-a350-e7c41563bc82","Type":"ContainerStarted","Data":"3270aec8576a82a0ef97a145dd205a484163421a835642079ccc918c207382b3"} Apr 24 22:02:08.850733 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:08.850697 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerID="744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17" exitCode=0 Apr 24 22:02:08.851194 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:08.850778 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" event={"ID":"ce611f4e-61d2-4f28-a350-e7c41563bc82","Type":"ContainerDied","Data":"744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17"} Apr 24 22:02:08.852038 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:08.852022 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:02:09.370079 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.370054 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:02:09.430201 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.430169 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw6gh\" (UniqueName: \"kubernetes.io/projected/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kube-api-access-lw6gh\") pod \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " Apr 24 22:02:09.430383 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.430236 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kserve-provision-location\") pod \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " Apr 24 22:02:09.430383 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.430320 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3a24a9-55b1-4279-aa96-2f627a013eb2-proxy-tls\") pod \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " Apr 24 22:02:09.430383 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.430349 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3a24a9-55b1-4279-aa96-2f627a013eb2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\" (UID: \"fc3a24a9-55b1-4279-aa96-2f627a013eb2\") " Apr 24 22:02:09.430601 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.430577 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"fc3a24a9-55b1-4279-aa96-2f627a013eb2" (UID: "fc3a24a9-55b1-4279-aa96-2f627a013eb2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:02:09.430808 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.430781 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc3a24a9-55b1-4279-aa96-2f627a013eb2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "fc3a24a9-55b1-4279-aa96-2f627a013eb2" (UID: "fc3a24a9-55b1-4279-aa96-2f627a013eb2"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:02:09.432532 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.432505 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kube-api-access-lw6gh" (OuterVolumeSpecName: "kube-api-access-lw6gh") pod "fc3a24a9-55b1-4279-aa96-2f627a013eb2" (UID: "fc3a24a9-55b1-4279-aa96-2f627a013eb2"). InnerVolumeSpecName "kube-api-access-lw6gh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:02:09.432643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.432534 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3a24a9-55b1-4279-aa96-2f627a013eb2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fc3a24a9-55b1-4279-aa96-2f627a013eb2" (UID: "fc3a24a9-55b1-4279-aa96-2f627a013eb2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:02:09.531524 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.531480 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3a24a9-55b1-4279-aa96-2f627a013eb2-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:02:09.531524 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.531513 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3a24a9-55b1-4279-aa96-2f627a013eb2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:02:09.531524 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.531523 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lw6gh\" (UniqueName: \"kubernetes.io/projected/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kube-api-access-lw6gh\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:02:09.531524 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.531534 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3a24a9-55b1-4279-aa96-2f627a013eb2-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:02:09.855040 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.854943 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" event={"ID":"ce611f4e-61d2-4f28-a350-e7c41563bc82","Type":"ContainerStarted","Data":"f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df"} Apr 24 22:02:09.855040 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.854981 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" 
event={"ID":"ce611f4e-61d2-4f28-a350-e7c41563bc82","Type":"ContainerStarted","Data":"31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7"} Apr 24 22:02:09.855560 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.855320 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:09.855560 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.855414 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:09.856626 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.856581 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:02:09.856786 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.856755 2571 generic.go:358] "Generic (PLEG): container finished" podID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerID="c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a" exitCode=0 Apr 24 22:02:09.856857 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.856790 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" event={"ID":"fc3a24a9-55b1-4279-aa96-2f627a013eb2","Type":"ContainerDied","Data":"c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a"} Apr 24 22:02:09.856857 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.856835 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" event={"ID":"fc3a24a9-55b1-4279-aa96-2f627a013eb2","Type":"ContainerDied","Data":"8b8aa1b4afe78cf4626bbce67cc3b108153689e5911ef00c925f0ae53172ea7b"} Apr 24 
22:02:09.856857 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.856848 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6" Apr 24 22:02:09.856964 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.856851 2571 scope.go:117] "RemoveContainer" containerID="764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf" Apr 24 22:02:09.864938 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.864920 2571 scope.go:117] "RemoveContainer" containerID="c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a" Apr 24 22:02:09.871844 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.871827 2571 scope.go:117] "RemoveContainer" containerID="d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff" Apr 24 22:02:09.876125 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.876080 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podStartSLOduration=5.876066898 podStartE2EDuration="5.876066898s" podCreationTimestamp="2026-04-24 22:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:02:09.874513771 +0000 UTC m=+2130.722557887" watchObservedRunningTime="2026-04-24 22:02:09.876066898 +0000 UTC m=+2130.724111015" Apr 24 22:02:09.879309 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.879278 2571 scope.go:117] "RemoveContainer" containerID="764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf" Apr 24 22:02:09.879619 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:02:09.879601 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf\": container with ID starting with 
764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf not found: ID does not exist" containerID="764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf" Apr 24 22:02:09.879693 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.879626 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf"} err="failed to get container status \"764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf\": rpc error: code = NotFound desc = could not find container \"764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf\": container with ID starting with 764580f34bb8cf428572f6977ffa683cb14b67c6cc7b95b56bad23dd009caeaf not found: ID does not exist" Apr 24 22:02:09.879693 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.879642 2571 scope.go:117] "RemoveContainer" containerID="c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a" Apr 24 22:02:09.879880 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:02:09.879861 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a\": container with ID starting with c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a not found: ID does not exist" containerID="c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a" Apr 24 22:02:09.879924 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.879883 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a"} err="failed to get container status \"c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a\": rpc error: code = NotFound desc = could not find container \"c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a\": container with ID starting with 
c3ff56faacd60985952cb377d49867ca47a64b1ea4cc033e1f18b7d580f46e4a not found: ID does not exist" Apr 24 22:02:09.879924 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.879897 2571 scope.go:117] "RemoveContainer" containerID="d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff" Apr 24 22:02:09.880092 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:02:09.880078 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff\": container with ID starting with d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff not found: ID does not exist" containerID="d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff" Apr 24 22:02:09.880130 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.880097 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff"} err="failed to get container status \"d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff\": rpc error: code = NotFound desc = could not find container \"d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff\": container with ID starting with d2902275fa38c73c01e913c6c8daf9c2cfee71caed5410895e4b55e1881065ff not found: ID does not exist" Apr 24 22:02:09.887502 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.887479 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6"] Apr 24 22:02:09.889745 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:09.889706 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-tbpm6"] Apr 24 22:02:10.860544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:10.860504 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:02:11.647160 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:11.647128 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" path="/var/lib/kubelet/pods/fc3a24a9-55b1-4279-aa96-2f627a013eb2/volumes" Apr 24 22:02:15.865956 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:15.865922 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:02:15.866592 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:15.866564 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:02:25.866994 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:25.866954 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:02:35.866918 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:35.866828 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:02:45.866752 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:45.866712 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:02:55.866778 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:02:55.866733 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:03:05.866414 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:05.866371 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:03:15.867203 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:15.867166 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:03:25.867967 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:25.867935 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:03:34.667601 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.667566 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl"] Apr 24 22:03:34.667989 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.667908 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" containerID="cri-o://31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7" gracePeriod=30 Apr 24 22:03:34.668062 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.667970 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kube-rbac-proxy" containerID="cri-o://f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df" gracePeriod=30 Apr 24 22:03:34.782534 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782502 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj"] Apr 24 22:03:34.782843 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782831 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="storage-initializer" Apr 24 22:03:34.782887 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782846 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="storage-initializer" Apr 24 22:03:34.782887 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782854 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kube-rbac-proxy" Apr 24 22:03:34.782887 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782860 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kube-rbac-proxy" Apr 24 22:03:34.782887 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782870 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" 
containerName="kserve-container" Apr 24 22:03:34.782887 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782876 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" Apr 24 22:03:34.783046 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782921 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kserve-container" Apr 24 22:03:34.783046 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.782929 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc3a24a9-55b1-4279-aa96-2f627a013eb2" containerName="kube-rbac-proxy" Apr 24 22:03:34.786143 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.786119 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:34.788755 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.788716 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:03:34.788876 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.788856 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 24 22:03:34.796316 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.796269 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj"] Apr 24 22:03:34.902510 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.902477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/940994d0-c144-4b66-bb5f-f1277f38400e-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod 
\"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:34.902724 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.902520 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vtcv\" (UniqueName: \"kubernetes.io/projected/940994d0-c144-4b66-bb5f-f1277f38400e-kube-api-access-4vtcv\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:34.902724 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.902590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:34.902724 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:34.902668 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/940994d0-c144-4b66-bb5f-f1277f38400e-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.003769 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.003730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/940994d0-c144-4b66-bb5f-f1277f38400e-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.003969 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.003786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vtcv\" (UniqueName: \"kubernetes.io/projected/940994d0-c144-4b66-bb5f-f1277f38400e-kube-api-access-4vtcv\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.003969 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.003823 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.003969 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.003880 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/940994d0-c144-4b66-bb5f-f1277f38400e-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.004145 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:03:35.004002 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-serving-cert: secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 24 
22:03:35.004145 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:03:35.004084 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls podName:940994d0-c144-4b66-bb5f-f1277f38400e nodeName:}" failed. No retries permitted until 2026-04-24 22:03:35.504060625 +0000 UTC m=+2216.352104722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls") pod "isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" (UID: "940994d0-c144-4b66-bb5f-f1277f38400e") : secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 24 22:03:35.004291 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.004268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/940994d0-c144-4b66-bb5f-f1277f38400e-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.004445 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.004426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/940994d0-c144-4b66-bb5f-f1277f38400e-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.013232 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.013209 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vtcv\" (UniqueName: \"kubernetes.io/projected/940994d0-c144-4b66-bb5f-f1277f38400e-kube-api-access-4vtcv\") pod 
\"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.086624 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.086589 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerID="f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df" exitCode=2 Apr 24 22:03:35.086790 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.086655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" event={"ID":"ce611f4e-61d2-4f28-a350-e7c41563bc82","Type":"ContainerDied","Data":"f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df"} Apr 24 22:03:35.508190 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.508161 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.510864 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.510840 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.696721 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.696693 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:35.819128 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.819096 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj"] Apr 24 22:03:35.821998 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:03:35.821971 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod940994d0_c144_4b66_bb5f_f1277f38400e.slice/crio-2fa948ddeb9d8460d30d27475b04c34b05919357a3bb43b5ef545a147ec3a532 WatchSource:0}: Error finding container 2fa948ddeb9d8460d30d27475b04c34b05919357a3bb43b5ef545a147ec3a532: Status 404 returned error can't find the container with id 2fa948ddeb9d8460d30d27475b04c34b05919357a3bb43b5ef545a147ec3a532 Apr 24 22:03:35.860798 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.860761 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 22:03:35.867109 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:35.867080 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 22:03:36.095848 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:36.095748 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" 
event={"ID":"940994d0-c144-4b66-bb5f-f1277f38400e","Type":"ContainerStarted","Data":"b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300"} Apr 24 22:03:36.095848 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:36.095802 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" event={"ID":"940994d0-c144-4b66-bb5f-f1277f38400e","Type":"ContainerStarted","Data":"2fa948ddeb9d8460d30d27475b04c34b05919357a3bb43b5ef545a147ec3a532"} Apr 24 22:03:40.107537 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.107502 2571 generic.go:358] "Generic (PLEG): container finished" podID="940994d0-c144-4b66-bb5f-f1277f38400e" containerID="b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300" exitCode=0 Apr 24 22:03:40.107977 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.107551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" event={"ID":"940994d0-c144-4b66-bb5f-f1277f38400e","Type":"ContainerDied","Data":"b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300"} Apr 24 22:03:40.800572 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.800550 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:03:40.957604 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.957522 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce611f4e-61d2-4f28-a350-e7c41563bc82-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"ce611f4e-61d2-4f28-a350-e7c41563bc82\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " Apr 24 22:03:40.957604 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.957561 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfbng\" (UniqueName: \"kubernetes.io/projected/ce611f4e-61d2-4f28-a350-e7c41563bc82-kube-api-access-bfbng\") pod \"ce611f4e-61d2-4f28-a350-e7c41563bc82\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " Apr 24 22:03:40.957604 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.957581 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce611f4e-61d2-4f28-a350-e7c41563bc82-kserve-provision-location\") pod \"ce611f4e-61d2-4f28-a350-e7c41563bc82\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " Apr 24 22:03:40.957897 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.957608 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce611f4e-61d2-4f28-a350-e7c41563bc82-proxy-tls\") pod \"ce611f4e-61d2-4f28-a350-e7c41563bc82\" (UID: \"ce611f4e-61d2-4f28-a350-e7c41563bc82\") " Apr 24 22:03:40.957897 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.957877 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce611f4e-61d2-4f28-a350-e7c41563bc82-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "ce611f4e-61d2-4f28-a350-e7c41563bc82" (UID: "ce611f4e-61d2-4f28-a350-e7c41563bc82"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:03:40.958004 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.957947 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce611f4e-61d2-4f28-a350-e7c41563bc82-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ce611f4e-61d2-4f28-a350-e7c41563bc82" (UID: "ce611f4e-61d2-4f28-a350-e7c41563bc82"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:03:40.959819 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.959792 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce611f4e-61d2-4f28-a350-e7c41563bc82-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ce611f4e-61d2-4f28-a350-e7c41563bc82" (UID: "ce611f4e-61d2-4f28-a350-e7c41563bc82"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:03:40.959819 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:40.959802 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce611f4e-61d2-4f28-a350-e7c41563bc82-kube-api-access-bfbng" (OuterVolumeSpecName: "kube-api-access-bfbng") pod "ce611f4e-61d2-4f28-a350-e7c41563bc82" (UID: "ce611f4e-61d2-4f28-a350-e7c41563bc82"). InnerVolumeSpecName "kube-api-access-bfbng". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:03:41.058592 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.058560 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bfbng\" (UniqueName: \"kubernetes.io/projected/ce611f4e-61d2-4f28-a350-e7c41563bc82-kube-api-access-bfbng\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:03:41.058592 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.058587 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce611f4e-61d2-4f28-a350-e7c41563bc82-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:03:41.058592 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.058598 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce611f4e-61d2-4f28-a350-e7c41563bc82-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:03:41.058794 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.058609 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce611f4e-61d2-4f28-a350-e7c41563bc82-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:03:41.112516 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.112483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" event={"ID":"940994d0-c144-4b66-bb5f-f1277f38400e","Type":"ContainerStarted","Data":"e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a"} Apr 24 22:03:41.112920 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.112523 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" 
event={"ID":"940994d0-c144-4b66-bb5f-f1277f38400e","Type":"ContainerStarted","Data":"bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0"} Apr 24 22:03:41.112920 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.112780 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:41.112920 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.112883 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:03:41.114158 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.114134 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerID="31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7" exitCode=0 Apr 24 22:03:41.114276 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.114178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" event={"ID":"ce611f4e-61d2-4f28-a350-e7c41563bc82","Type":"ContainerDied","Data":"31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7"} Apr 24 22:03:41.114276 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.114201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" event={"ID":"ce611f4e-61d2-4f28-a350-e7c41563bc82","Type":"ContainerDied","Data":"3270aec8576a82a0ef97a145dd205a484163421a835642079ccc918c207382b3"} Apr 24 22:03:41.114276 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.114221 2571 scope.go:117] "RemoveContainer" containerID="f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df" Apr 24 22:03:41.114276 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.114224 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl" Apr 24 22:03:41.122466 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.122447 2571 scope.go:117] "RemoveContainer" containerID="31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7" Apr 24 22:03:41.129803 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.129786 2571 scope.go:117] "RemoveContainer" containerID="744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17" Apr 24 22:03:41.136526 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.136507 2571 scope.go:117] "RemoveContainer" containerID="f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df" Apr 24 22:03:41.136784 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:03:41.136742 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df\": container with ID starting with f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df not found: ID does not exist" containerID="f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df" Apr 24 22:03:41.136831 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.136792 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df"} err="failed to get container status \"f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df\": rpc error: code = NotFound desc = could not find container \"f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df\": container with ID starting with f7c3db8bdf7d2852f7eb4baa26e80b50465815905386e661cc8dcaa85cb048df not found: ID does not exist" Apr 24 22:03:41.136831 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.136809 2571 scope.go:117] "RemoveContainer" containerID="31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7" Apr 24 22:03:41.137031 
ip-10-0-139-5 kubenswrapper[2571]: E0424 22:03:41.137014 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7\": container with ID starting with 31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7 not found: ID does not exist" containerID="31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7" Apr 24 22:03:41.137074 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.137038 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7"} err="failed to get container status \"31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7\": rpc error: code = NotFound desc = could not find container \"31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7\": container with ID starting with 31ce9039c3b9125a0fb780504f7640761f23357a614f6eb9d2641f1a4bdc78b7 not found: ID does not exist" Apr 24 22:03:41.137074 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.137054 2571 scope.go:117] "RemoveContainer" containerID="744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17" Apr 24 22:03:41.137279 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:03:41.137262 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17\": container with ID starting with 744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17 not found: ID does not exist" containerID="744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17" Apr 24 22:03:41.137448 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.137285 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17"} 
err="failed to get container status \"744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17\": rpc error: code = NotFound desc = could not find container \"744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17\": container with ID starting with 744db9e616b7931f5a5104a69a2b4ad3cfe5b678c98503e0f17d9079a49f4d17 not found: ID does not exist" Apr 24 22:03:41.141467 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.141430 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podStartSLOduration=7.141419368 podStartE2EDuration="7.141419368s" podCreationTimestamp="2026-04-24 22:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:03:41.14112499 +0000 UTC m=+2221.989169120" watchObservedRunningTime="2026-04-24 22:03:41.141419368 +0000 UTC m=+2221.989463484" Apr 24 22:03:41.165454 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.165435 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl"] Apr 24 22:03:41.168266 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.168248 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-92wfl"] Apr 24 22:03:41.647761 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:41.647727 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" path="/var/lib/kubelet/pods/ce611f4e-61d2-4f28-a350-e7c41563bc82/volumes" Apr 24 22:03:47.123700 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:03:47.123668 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:04:17.124888 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:04:17.124847 2571 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 22:04:27.124620 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:04:27.124582 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 22:04:37.125371 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:04:37.125317 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 22:04:47.124358 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:04:47.124284 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 22:04:57.127947 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:04:57.127915 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:05:04.922458 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:04.922419 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj"] Apr 24 22:05:04.922972 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:04.922831 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" containerID="cri-o://bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0" gracePeriod=30 Apr 24 22:05:04.922972 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:04.922880 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kube-rbac-proxy" containerID="cri-o://e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a" gracePeriod=30 Apr 24 22:05:05.030478 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030447 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85"] Apr 24 22:05:05.030861 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030844 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kube-rbac-proxy" Apr 24 22:05:05.030921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030868 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kube-rbac-proxy" Apr 24 22:05:05.030921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030881 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" Apr 24 22:05:05.030921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030889 2571 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" Apr 24 22:05:05.030921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030902 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="storage-initializer" Apr 24 22:05:05.030921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030908 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="storage-initializer" Apr 24 22:05:05.031096 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030964 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kube-rbac-proxy" Apr 24 22:05:05.031096 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.030980 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce611f4e-61d2-4f28-a350-e7c41563bc82" containerName="kserve-container" Apr 24 22:05:05.036643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.036613 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.039078 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.038999 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:05:05.039078 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.039029 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 24 22:05:05.042927 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.042900 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85"] Apr 24 22:05:05.221173 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.221076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5477d971-6d40-4a74-928c-fb60b4949513-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.221173 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.221143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5477d971-6d40-4a74-928c-fb60b4949513-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.221394 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.221196 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-wj7cr\" (UniqueName: \"kubernetes.io/projected/5477d971-6d40-4a74-928c-fb60b4949513-kube-api-access-wj7cr\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.221394 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.221218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5477d971-6d40-4a74-928c-fb60b4949513-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.321559 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.321529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj7cr\" (UniqueName: \"kubernetes.io/projected/5477d971-6d40-4a74-928c-fb60b4949513-kube-api-access-wj7cr\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.321762 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.321566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5477d971-6d40-4a74-928c-fb60b4949513-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.321762 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.321713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5477d971-6d40-4a74-928c-fb60b4949513-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.321945 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.321809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5477d971-6d40-4a74-928c-fb60b4949513-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.321945 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.321898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5477d971-6d40-4a74-928c-fb60b4949513-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.322428 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.322405 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5477d971-6d40-4a74-928c-fb60b4949513-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.324385 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.324367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/5477d971-6d40-4a74-928c-fb60b4949513-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.330180 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.330152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj7cr\" (UniqueName: \"kubernetes.io/projected/5477d971-6d40-4a74-928c-fb60b4949513-kube-api-access-wj7cr\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.344969 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.344940 2571 generic.go:358] "Generic (PLEG): container finished" podID="940994d0-c144-4b66-bb5f-f1277f38400e" containerID="e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a" exitCode=2 Apr 24 22:05:05.345115 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.344977 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" event={"ID":"940994d0-c144-4b66-bb5f-f1277f38400e","Type":"ContainerDied","Data":"e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a"} Apr 24 22:05:05.348158 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.348142 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:05.470190 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:05.470157 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85"] Apr 24 22:05:05.473084 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:05:05.473013 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5477d971_6d40_4a74_928c_fb60b4949513.slice/crio-095a3673533c7589b023ea1e774e5e43999e6a1eb29604a2ee8d91a0c1af43d1 WatchSource:0}: Error finding container 095a3673533c7589b023ea1e774e5e43999e6a1eb29604a2ee8d91a0c1af43d1: Status 404 returned error can't find the container with id 095a3673533c7589b023ea1e774e5e43999e6a1eb29604a2ee8d91a0c1af43d1 Apr 24 22:05:06.349339 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:06.349291 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" event={"ID":"5477d971-6d40-4a74-928c-fb60b4949513","Type":"ContainerStarted","Data":"23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16"} Apr 24 22:05:06.349339 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:06.349340 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" event={"ID":"5477d971-6d40-4a74-928c-fb60b4949513","Type":"ContainerStarted","Data":"095a3673533c7589b023ea1e774e5e43999e6a1eb29604a2ee8d91a0c1af43d1"} Apr 24 22:05:07.119424 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:07.119384 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.41:8643/healthz\": dial tcp 10.134.0.41:8643: 
connect: connection refused" Apr 24 22:05:07.124880 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:07.124843 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 22:05:09.358838 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.358801 2571 generic.go:358] "Generic (PLEG): container finished" podID="5477d971-6d40-4a74-928c-fb60b4949513" containerID="23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16" exitCode=0 Apr 24 22:05:09.359233 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.358874 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" event={"ID":"5477d971-6d40-4a74-928c-fb60b4949513","Type":"ContainerDied","Data":"23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16"} Apr 24 22:05:09.661373 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.661349 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:05:09.751520 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.751484 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/940994d0-c144-4b66-bb5f-f1277f38400e-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"940994d0-c144-4b66-bb5f-f1277f38400e\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " Apr 24 22:05:09.751695 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.751545 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vtcv\" (UniqueName: \"kubernetes.io/projected/940994d0-c144-4b66-bb5f-f1277f38400e-kube-api-access-4vtcv\") pod \"940994d0-c144-4b66-bb5f-f1277f38400e\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " Apr 24 22:05:09.751695 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.751600 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls\") pod \"940994d0-c144-4b66-bb5f-f1277f38400e\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " Apr 24 22:05:09.751889 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.751870 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940994d0-c144-4b66-bb5f-f1277f38400e-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "940994d0-c144-4b66-bb5f-f1277f38400e" (UID: "940994d0-c144-4b66-bb5f-f1277f38400e"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:05:09.753691 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.753665 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940994d0-c144-4b66-bb5f-f1277f38400e-kube-api-access-4vtcv" (OuterVolumeSpecName: "kube-api-access-4vtcv") pod "940994d0-c144-4b66-bb5f-f1277f38400e" (UID: "940994d0-c144-4b66-bb5f-f1277f38400e"). InnerVolumeSpecName "kube-api-access-4vtcv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:05:09.753762 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.753717 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "940994d0-c144-4b66-bb5f-f1277f38400e" (UID: "940994d0-c144-4b66-bb5f-f1277f38400e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:05:09.852454 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.852415 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/940994d0-c144-4b66-bb5f-f1277f38400e-kserve-provision-location\") pod \"940994d0-c144-4b66-bb5f-f1277f38400e\" (UID: \"940994d0-c144-4b66-bb5f-f1277f38400e\") " Apr 24 22:05:09.852654 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.852633 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/940994d0-c144-4b66-bb5f-f1277f38400e-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:05:09.852729 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.852659 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vtcv\" (UniqueName: 
\"kubernetes.io/projected/940994d0-c144-4b66-bb5f-f1277f38400e-kube-api-access-4vtcv\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:05:09.852729 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.852673 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/940994d0-c144-4b66-bb5f-f1277f38400e-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:05:09.852729 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.852682 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940994d0-c144-4b66-bb5f-f1277f38400e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "940994d0-c144-4b66-bb5f-f1277f38400e" (UID: "940994d0-c144-4b66-bb5f-f1277f38400e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:09.953678 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:09.953586 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/940994d0-c144-4b66-bb5f-f1277f38400e-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:05:10.362903 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.362864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" event={"ID":"5477d971-6d40-4a74-928c-fb60b4949513","Type":"ContainerStarted","Data":"65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125"} Apr 24 22:05:10.363372 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.362916 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" event={"ID":"5477d971-6d40-4a74-928c-fb60b4949513","Type":"ContainerStarted","Data":"2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67"} Apr 24 
22:05:10.363372 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.363145 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:10.364421 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.364398 2571 generic.go:358] "Generic (PLEG): container finished" podID="940994d0-c144-4b66-bb5f-f1277f38400e" containerID="bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0" exitCode=0 Apr 24 22:05:10.364536 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.364463 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" event={"ID":"940994d0-c144-4b66-bb5f-f1277f38400e","Type":"ContainerDied","Data":"bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0"} Apr 24 22:05:10.364536 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.364481 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" event={"ID":"940994d0-c144-4b66-bb5f-f1277f38400e","Type":"ContainerDied","Data":"2fa948ddeb9d8460d30d27475b04c34b05919357a3bb43b5ef545a147ec3a532"} Apr 24 22:05:10.364536 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.364481 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj" Apr 24 22:05:10.364536 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.364494 2571 scope.go:117] "RemoveContainer" containerID="e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a" Apr 24 22:05:10.372325 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.372138 2571 scope.go:117] "RemoveContainer" containerID="bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0" Apr 24 22:05:10.379161 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.379141 2571 scope.go:117] "RemoveContainer" containerID="b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300" Apr 24 22:05:10.386670 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.386649 2571 scope.go:117] "RemoveContainer" containerID="e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a" Apr 24 22:05:10.386761 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.386649 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podStartSLOduration=5.386637046 podStartE2EDuration="5.386637046s" podCreationTimestamp="2026-04-24 22:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:05:10.384192452 +0000 UTC m=+2311.232236568" watchObservedRunningTime="2026-04-24 22:05:10.386637046 +0000 UTC m=+2311.234681161" Apr 24 22:05:10.386907 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:05:10.386890 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a\": container with ID starting with e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a not found: ID does not exist" containerID="e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a" Apr 24 
22:05:10.386955 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.386915 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a"} err="failed to get container status \"e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a\": rpc error: code = NotFound desc = could not find container \"e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a\": container with ID starting with e3964aa80ebe16b105d2426a97a10ffb1f9ed988e1fd9e6a33dc6f6b9f867e7a not found: ID does not exist" Apr 24 22:05:10.386955 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.386933 2571 scope.go:117] "RemoveContainer" containerID="bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0" Apr 24 22:05:10.387227 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:05:10.387208 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0\": container with ID starting with bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0 not found: ID does not exist" containerID="bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0" Apr 24 22:05:10.387311 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.387238 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0"} err="failed to get container status \"bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0\": rpc error: code = NotFound desc = could not find container \"bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0\": container with ID starting with bc496fcd5e7078cd3da699f853b9081d84b75ac37fc4544eebb21759b2dbd2f0 not found: ID does not exist" Apr 24 22:05:10.387311 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.387263 2571 scope.go:117] 
"RemoveContainer" containerID="b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300" Apr 24 22:05:10.387555 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:05:10.387534 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300\": container with ID starting with b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300 not found: ID does not exist" containerID="b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300" Apr 24 22:05:10.387633 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.387558 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300"} err="failed to get container status \"b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300\": rpc error: code = NotFound desc = could not find container \"b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300\": container with ID starting with b1f8329534eb73c31f9603cfab89f35d23afaa206135afa11f3e4df1acb2f300 not found: ID does not exist" Apr 24 22:05:10.397205 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.397180 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj"] Apr 24 22:05:10.400817 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:10.400793 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-dp7nj"] Apr 24 22:05:11.368345 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:11.368318 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:11.647223 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:11.647131 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" path="/var/lib/kubelet/pods/940994d0-c144-4b66-bb5f-f1277f38400e/volumes" Apr 24 22:05:17.376236 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:17.376207 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:05:47.376848 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:47.376809 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 22:05:57.376893 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:05:57.376847 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 22:06:07.377038 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:07.376992 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 22:06:17.377085 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:17.377047 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podUID="5477d971-6d40-4a74-928c-fb60b4949513" 
containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 22:06:27.380835 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:27.380800 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:06:35.156536 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.156507 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85"] Apr 24 22:06:35.156924 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.156838 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kserve-container" containerID="cri-o://2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67" gracePeriod=30 Apr 24 22:06:35.156997 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.156889 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kube-rbac-proxy" containerID="cri-o://65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125" gracePeriod=30 Apr 24 22:06:35.277466 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.277438 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj"] Apr 24 22:06:35.277716 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.277705 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kube-rbac-proxy" Apr 24 22:06:35.277759 ip-10-0-139-5 kubenswrapper[2571]: I0424 
22:06:35.277718 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kube-rbac-proxy" Apr 24 22:06:35.277759 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.277728 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" Apr 24 22:06:35.277759 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.277734 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" Apr 24 22:06:35.277759 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.277745 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="storage-initializer" Apr 24 22:06:35.277759 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.277750 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="storage-initializer" Apr 24 22:06:35.277910 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.277809 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kserve-container" Apr 24 22:06:35.277910 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.277817 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="940994d0-c144-4b66-bb5f-f1277f38400e" containerName="kube-rbac-proxy" Apr 24 22:06:35.280592 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.280577 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.283101 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.283079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 24 22:06:35.283201 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.283085 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:06:35.292077 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.292057 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj"] Apr 24 22:06:35.319419 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.319388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhd59\" (UniqueName: \"kubernetes.io/projected/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kube-api-access-vhd59\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.319540 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.319439 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.319540 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.319499 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f86845d-a7a3-4031-b938-05e8f95d6ec0-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.319651 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.319565 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f86845d-a7a3-4031-b938-05e8f95d6ec0-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.420631 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.420563 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f86845d-a7a3-4031-b938-05e8f95d6ec0-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.420768 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.420643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f86845d-a7a3-4031-b938-05e8f95d6ec0-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.420768 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.420703 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vhd59\" (UniqueName: \"kubernetes.io/projected/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kube-api-access-vhd59\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.420768 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.420746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.421159 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.421137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.421404 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.421382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f86845d-a7a3-4031-b938-05e8f95d6ec0-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.423191 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.423172 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f86845d-a7a3-4031-b938-05e8f95d6ec0-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.429336 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.429317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhd59\" (UniqueName: \"kubernetes.io/projected/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kube-api-access-vhd59\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.590134 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.590100 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:35.603574 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.603546 2571 generic.go:358] "Generic (PLEG): container finished" podID="5477d971-6d40-4a74-928c-fb60b4949513" containerID="65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125" exitCode=2 Apr 24 22:06:35.603726 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.603621 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" event={"ID":"5477d971-6d40-4a74-928c-fb60b4949513","Type":"ContainerDied","Data":"65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125"} Apr 24 22:06:35.711496 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:35.711388 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj"] Apr 24 22:06:35.713998 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:06:35.713970 2571 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f86845d_a7a3_4031_b938_05e8f95d6ec0.slice/crio-f86367f1160f5448f1db1000dd5d278213d932a0d95adb7302796fbb24f4a2d6 WatchSource:0}: Error finding container f86367f1160f5448f1db1000dd5d278213d932a0d95adb7302796fbb24f4a2d6: Status 404 returned error can't find the container with id f86367f1160f5448f1db1000dd5d278213d932a0d95adb7302796fbb24f4a2d6 Apr 24 22:06:36.607556 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:36.607518 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" event={"ID":"1f86845d-a7a3-4031-b938-05e8f95d6ec0","Type":"ContainerStarted","Data":"126565e890491d53c706c0229aaa0a5a5f802f710123c1df754be74c2c97c106"} Apr 24 22:06:36.607556 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:36.607559 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" event={"ID":"1f86845d-a7a3-4031-b938-05e8f95d6ec0","Type":"ContainerStarted","Data":"f86367f1160f5448f1db1000dd5d278213d932a0d95adb7302796fbb24f4a2d6"} Apr 24 22:06:37.371445 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:37.371398 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 24 22:06:37.376878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:37.376846 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.134.0.42:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 22:06:39.617132 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:39.617052 2571 generic.go:358] "Generic (PLEG): container finished" podID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerID="126565e890491d53c706c0229aaa0a5a5f802f710123c1df754be74c2c97c106" exitCode=0 Apr 24 22:06:39.617132 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:39.617116 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" event={"ID":"1f86845d-a7a3-4031-b938-05e8f95d6ec0","Type":"ContainerDied","Data":"126565e890491d53c706c0229aaa0a5a5f802f710123c1df754be74c2c97c106"} Apr 24 22:06:40.190398 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.190370 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:06:40.259574 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.259535 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5477d971-6d40-4a74-928c-fb60b4949513-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"5477d971-6d40-4a74-928c-fb60b4949513\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " Apr 24 22:06:40.259732 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.259583 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5477d971-6d40-4a74-928c-fb60b4949513-proxy-tls\") pod \"5477d971-6d40-4a74-928c-fb60b4949513\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " Apr 24 22:06:40.259732 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.259603 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5477d971-6d40-4a74-928c-fb60b4949513-kserve-provision-location\") pod \"5477d971-6d40-4a74-928c-fb60b4949513\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " Apr 24 22:06:40.259732 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.259643 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj7cr\" (UniqueName: \"kubernetes.io/projected/5477d971-6d40-4a74-928c-fb60b4949513-kube-api-access-wj7cr\") pod \"5477d971-6d40-4a74-928c-fb60b4949513\" (UID: \"5477d971-6d40-4a74-928c-fb60b4949513\") " Apr 24 22:06:40.259953 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.259928 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5477d971-6d40-4a74-928c-fb60b4949513-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "5477d971-6d40-4a74-928c-fb60b4949513" (UID: "5477d971-6d40-4a74-928c-fb60b4949513"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:06:40.259953 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.259940 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5477d971-6d40-4a74-928c-fb60b4949513-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5477d971-6d40-4a74-928c-fb60b4949513" (UID: "5477d971-6d40-4a74-928c-fb60b4949513"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:06:40.261792 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.261767 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5477d971-6d40-4a74-928c-fb60b4949513-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5477d971-6d40-4a74-928c-fb60b4949513" (UID: "5477d971-6d40-4a74-928c-fb60b4949513"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:06:40.261882 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.261796 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5477d971-6d40-4a74-928c-fb60b4949513-kube-api-access-wj7cr" (OuterVolumeSpecName: "kube-api-access-wj7cr") pod "5477d971-6d40-4a74-928c-fb60b4949513" (UID: "5477d971-6d40-4a74-928c-fb60b4949513"). InnerVolumeSpecName "kube-api-access-wj7cr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:06:40.360766 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.360667 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wj7cr\" (UniqueName: \"kubernetes.io/projected/5477d971-6d40-4a74-928c-fb60b4949513-kube-api-access-wj7cr\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:06:40.360766 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.360700 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5477d971-6d40-4a74-928c-fb60b4949513-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:06:40.360766 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.360714 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5477d971-6d40-4a74-928c-fb60b4949513-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:06:40.360766 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.360729 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5477d971-6d40-4a74-928c-fb60b4949513-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:06:40.621947 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.621848 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" event={"ID":"1f86845d-a7a3-4031-b938-05e8f95d6ec0","Type":"ContainerStarted","Data":"3b26b7ee962bf2d8c108e0c4a15dd434329c83e0dd18f1f210e9c0d2166fb1a8"} Apr 24 22:06:40.621947 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.621893 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" event={"ID":"1f86845d-a7a3-4031-b938-05e8f95d6ec0","Type":"ContainerStarted","Data":"b5adab34e6dbd84f1817f3a50eea9a13bd6841bfa43c4f4d8323184f705539ad"} Apr 24 22:06:40.622479 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.622134 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:40.622479 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.622197 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:06:40.623439 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.623416 2571 generic.go:358] "Generic (PLEG): container finished" podID="5477d971-6d40-4a74-928c-fb60b4949513" containerID="2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67" exitCode=0 Apr 24 22:06:40.623544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.623462 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" event={"ID":"5477d971-6d40-4a74-928c-fb60b4949513","Type":"ContainerDied","Data":"2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67"} Apr 24 22:06:40.623544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.623475 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" Apr 24 22:06:40.623544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.623489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85" event={"ID":"5477d971-6d40-4a74-928c-fb60b4949513","Type":"ContainerDied","Data":"095a3673533c7589b023ea1e774e5e43999e6a1eb29604a2ee8d91a0c1af43d1"} Apr 24 22:06:40.623544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.623506 2571 scope.go:117] "RemoveContainer" containerID="65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125" Apr 24 22:06:40.631873 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.631857 2571 scope.go:117] "RemoveContainer" containerID="2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67" Apr 24 22:06:40.638872 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.638854 2571 scope.go:117] "RemoveContainer" containerID="23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16" Apr 24 22:06:40.644109 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.644063 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podStartSLOduration=5.6440467210000005 podStartE2EDuration="5.644046721s" podCreationTimestamp="2026-04-24 22:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:06:40.642846109 +0000 UTC m=+2401.490890260" 
watchObservedRunningTime="2026-04-24 22:06:40.644046721 +0000 UTC m=+2401.492090840" Apr 24 22:06:40.646846 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.646828 2571 scope.go:117] "RemoveContainer" containerID="65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125" Apr 24 22:06:40.647097 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:06:40.647081 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125\": container with ID starting with 65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125 not found: ID does not exist" containerID="65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125" Apr 24 22:06:40.647166 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.647105 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125"} err="failed to get container status \"65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125\": rpc error: code = NotFound desc = could not find container \"65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125\": container with ID starting with 65e5502679d7213ef23d46ab7a76d9c1ea67cfa0a52fe5bebb99528921be9125 not found: ID does not exist" Apr 24 22:06:40.647166 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.647120 2571 scope.go:117] "RemoveContainer" containerID="2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67" Apr 24 22:06:40.647369 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:06:40.647347 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67\": container with ID starting with 2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67 not found: ID does not exist" 
containerID="2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67" Apr 24 22:06:40.647479 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.647371 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67"} err="failed to get container status \"2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67\": rpc error: code = NotFound desc = could not find container \"2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67\": container with ID starting with 2bf544494b11583faf6f84e0d8a3bd29310b76d74330f7db138f616c00b18f67 not found: ID does not exist" Apr 24 22:06:40.647479 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.647388 2571 scope.go:117] "RemoveContainer" containerID="23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16" Apr 24 22:06:40.647626 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:06:40.647606 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16\": container with ID starting with 23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16 not found: ID does not exist" containerID="23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16" Apr 24 22:06:40.647682 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.647633 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16"} err="failed to get container status \"23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16\": rpc error: code = NotFound desc = could not find container \"23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16\": container with ID starting with 23fb3c5517bf0da0e3bef20395207dce9c96a75e81f288c3b57039b3f4c02b16 not found: ID does not exist" Apr 24 22:06:40.655900 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.655875 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85"] Apr 24 22:06:40.661562 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:40.661539 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-t5k85"] Apr 24 22:06:41.647316 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:41.647278 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5477d971-6d40-4a74-928c-fb60b4949513" path="/var/lib/kubelet/pods/5477d971-6d40-4a74-928c-fb60b4949513/volumes" Apr 24 22:06:46.633286 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:06:46.633253 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:07:16.634049 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:07:16.634001 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 22:07:26.634901 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:07:26.634860 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 22:07:36.634118 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:07:36.634070 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 22:07:46.633873 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:07:46.633831 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 22:07:56.637203 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:07:56.637170 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:08:05.399689 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:05.399655 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj"] Apr 24 22:08:05.400125 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:05.400080 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" containerID="cri-o://b5adab34e6dbd84f1817f3a50eea9a13bd6841bfa43c4f4d8323184f705539ad" gracePeriod=30 Apr 24 22:08:05.400125 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:05.400095 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kube-rbac-proxy" 
containerID="cri-o://3b26b7ee962bf2d8c108e0c4a15dd434329c83e0dd18f1f210e9c0d2166fb1a8" gracePeriod=30 Apr 24 22:08:05.861096 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:05.861061 2571 generic.go:358] "Generic (PLEG): container finished" podID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerID="3b26b7ee962bf2d8c108e0c4a15dd434329c83e0dd18f1f210e9c0d2166fb1a8" exitCode=2 Apr 24 22:08:05.861269 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:05.861104 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" event={"ID":"1f86845d-a7a3-4031-b938-05e8f95d6ec0","Type":"ContainerDied","Data":"3b26b7ee962bf2d8c108e0c4a15dd434329c83e0dd18f1f210e9c0d2166fb1a8"} Apr 24 22:08:06.628046 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:06.628003 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused" Apr 24 22:08:06.634468 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:06.634432 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 22:08:07.647202 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647171 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb"] Apr 24 22:08:07.647643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647454 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kube-rbac-proxy" Apr 24 22:08:07.647643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647465 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kube-rbac-proxy" Apr 24 22:08:07.647643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647481 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kserve-container" Apr 24 22:08:07.647643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647487 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kserve-container" Apr 24 22:08:07.647643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647493 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="storage-initializer" Apr 24 22:08:07.647643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647498 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="storage-initializer" Apr 24 22:08:07.647643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647541 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kserve-container" Apr 24 22:08:07.647643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.647550 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5477d971-6d40-4a74-928c-fb60b4949513" containerName="kube-rbac-proxy" Apr 24 22:08:07.650513 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.650498 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.652962 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.652941 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 24 22:08:07.653144 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.653132 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 24 22:08:07.658206 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.658184 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb"] Apr 24 22:08:07.750745 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.750700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194af666-7fcc-4d84-93d0-d8efdfda22f0-kserve-provision-location\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.750940 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.750753 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.750940 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.750895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/194af666-7fcc-4d84-93d0-d8efdfda22f0-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.750940 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.750929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqvt\" (UniqueName: \"kubernetes.io/projected/194af666-7fcc-4d84-93d0-d8efdfda22f0-kube-api-access-5sqvt\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.851474 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.851435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/194af666-7fcc-4d84-93d0-d8efdfda22f0-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.851474 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.851475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqvt\" (UniqueName: \"kubernetes.io/projected/194af666-7fcc-4d84-93d0-d8efdfda22f0-kube-api-access-5sqvt\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.851747 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.851526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194af666-7fcc-4d84-93d0-d8efdfda22f0-kserve-provision-location\") pod 
\"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.851747 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.851552 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.851747 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:08:07.851681 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-predictor-serving-cert: secret "isvc-sklearn-predictor-serving-cert" not found Apr 24 22:08:07.851928 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:08:07.851751 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls podName:194af666-7fcc-4d84-93d0-d8efdfda22f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:08:08.351727908 +0000 UTC m=+2489.199772005 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls") pod "isvc-sklearn-predictor-77f5c96b44-5d9wb" (UID: "194af666-7fcc-4d84-93d0-d8efdfda22f0") : secret "isvc-sklearn-predictor-serving-cert" not found Apr 24 22:08:07.851928 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.851910 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194af666-7fcc-4d84-93d0-d8efdfda22f0-kserve-provision-location\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.852132 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.852113 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/194af666-7fcc-4d84-93d0-d8efdfda22f0-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:07.860160 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:07.860137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqvt\" (UniqueName: \"kubernetes.io/projected/194af666-7fcc-4d84-93d0-d8efdfda22f0-kube-api-access-5sqvt\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:08.355687 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:08.355644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: 
\"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:08.358237 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:08.358206 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls\") pod \"isvc-sklearn-predictor-77f5c96b44-5d9wb\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:08.561591 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:08.561554 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:08.680873 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:08.680842 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb"] Apr 24 22:08:08.683494 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:08:08.683453 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194af666_7fcc_4d84_93d0_d8efdfda22f0.slice/crio-7d2c85389345f9907635301c01e1a7c284814a2104ee3687e8f57e4ee9987b7b WatchSource:0}: Error finding container 7d2c85389345f9907635301c01e1a7c284814a2104ee3687e8f57e4ee9987b7b: Status 404 returned error can't find the container with id 7d2c85389345f9907635301c01e1a7c284814a2104ee3687e8f57e4ee9987b7b Apr 24 22:08:08.685392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:08.685362 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:08:08.870487 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:08.870411 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" 
event={"ID":"194af666-7fcc-4d84-93d0-d8efdfda22f0","Type":"ContainerStarted","Data":"5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8"} Apr 24 22:08:08.870487 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:08.870445 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" event={"ID":"194af666-7fcc-4d84-93d0-d8efdfda22f0","Type":"ContainerStarted","Data":"7d2c85389345f9907635301c01e1a7c284814a2104ee3687e8f57e4ee9987b7b"} Apr 24 22:08:10.877104 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:10.877072 2571 generic.go:358] "Generic (PLEG): container finished" podID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerID="b5adab34e6dbd84f1817f3a50eea9a13bd6841bfa43c4f4d8323184f705539ad" exitCode=0 Apr 24 22:08:10.877487 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:10.877147 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" event={"ID":"1f86845d-a7a3-4031-b938-05e8f95d6ec0","Type":"ContainerDied","Data":"b5adab34e6dbd84f1817f3a50eea9a13bd6841bfa43c4f4d8323184f705539ad"} Apr 24 22:08:11.331168 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.331144 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:08:11.482632 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.482537 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kserve-provision-location\") pod \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " Apr 24 22:08:11.482632 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.482580 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f86845d-a7a3-4031-b938-05e8f95d6ec0-proxy-tls\") pod \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " Apr 24 22:08:11.482632 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.482623 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhd59\" (UniqueName: \"kubernetes.io/projected/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kube-api-access-vhd59\") pod \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " Apr 24 22:08:11.482878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.482662 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f86845d-a7a3-4031-b938-05e8f95d6ec0-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\" (UID: \"1f86845d-a7a3-4031-b938-05e8f95d6ec0\") " Apr 24 22:08:11.482920 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.482881 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"1f86845d-a7a3-4031-b938-05e8f95d6ec0" (UID: "1f86845d-a7a3-4031-b938-05e8f95d6ec0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:08:11.483093 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.483070 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f86845d-a7a3-4031-b938-05e8f95d6ec0-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "1f86845d-a7a3-4031-b938-05e8f95d6ec0" (UID: "1f86845d-a7a3-4031-b938-05e8f95d6ec0"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:08:11.484933 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.484906 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f86845d-a7a3-4031-b938-05e8f95d6ec0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1f86845d-a7a3-4031-b938-05e8f95d6ec0" (UID: "1f86845d-a7a3-4031-b938-05e8f95d6ec0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:08:11.484933 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.484922 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kube-api-access-vhd59" (OuterVolumeSpecName: "kube-api-access-vhd59") pod "1f86845d-a7a3-4031-b938-05e8f95d6ec0" (UID: "1f86845d-a7a3-4031-b938-05e8f95d6ec0"). InnerVolumeSpecName "kube-api-access-vhd59". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:08:11.584235 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.584117 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f86845d-a7a3-4031-b938-05e8f95d6ec0-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:08:11.584235 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.584170 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:08:11.584235 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.584191 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f86845d-a7a3-4031-b938-05e8f95d6ec0-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:08:11.584235 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.584208 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vhd59\" (UniqueName: \"kubernetes.io/projected/1f86845d-a7a3-4031-b938-05e8f95d6ec0-kube-api-access-vhd59\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:08:11.882313 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.882263 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" event={"ID":"1f86845d-a7a3-4031-b938-05e8f95d6ec0","Type":"ContainerDied","Data":"f86367f1160f5448f1db1000dd5d278213d932a0d95adb7302796fbb24f4a2d6"} Apr 24 22:08:11.882725 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.882331 2571 scope.go:117] "RemoveContainer" containerID="3b26b7ee962bf2d8c108e0c4a15dd434329c83e0dd18f1f210e9c0d2166fb1a8" Apr 24 22:08:11.882725 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:08:11.882341 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj" Apr 24 22:08:11.890124 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.890107 2571 scope.go:117] "RemoveContainer" containerID="b5adab34e6dbd84f1817f3a50eea9a13bd6841bfa43c4f4d8323184f705539ad" Apr 24 22:08:11.897340 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.897313 2571 scope.go:117] "RemoveContainer" containerID="126565e890491d53c706c0229aaa0a5a5f802f710123c1df754be74c2c97c106" Apr 24 22:08:11.899797 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.899776 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj"] Apr 24 22:08:11.903152 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:11.903129 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-qtvnj"] Apr 24 22:08:12.887351 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:12.887290 2571 generic.go:358] "Generic (PLEG): container finished" podID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerID="5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8" exitCode=0 Apr 24 22:08:12.887769 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:12.887367 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" event={"ID":"194af666-7fcc-4d84-93d0-d8efdfda22f0","Type":"ContainerDied","Data":"5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8"} Apr 24 22:08:13.648219 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:13.648184 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" path="/var/lib/kubelet/pods/1f86845d-a7a3-4031-b938-05e8f95d6ec0/volumes" Apr 24 22:08:13.891312 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:13.891254 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" event={"ID":"194af666-7fcc-4d84-93d0-d8efdfda22f0","Type":"ContainerStarted","Data":"ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055"} Apr 24 22:08:13.891769 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:13.891321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" event={"ID":"194af666-7fcc-4d84-93d0-d8efdfda22f0","Type":"ContainerStarted","Data":"6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b"} Apr 24 22:08:13.891769 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:13.891551 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:13.912141 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:13.912052 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podStartSLOduration=6.912035462 podStartE2EDuration="6.912035462s" podCreationTimestamp="2026-04-24 22:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:08:13.910254123 +0000 UTC m=+2494.758298238" watchObservedRunningTime="2026-04-24 22:08:13.912035462 +0000 UTC m=+2494.760079579" Apr 24 22:08:14.894805 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:14.894767 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:14.895994 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:14.895965 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.44:8080: connect: connection refused" Apr 24 22:08:15.897915 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:15.897874 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 22:08:20.901899 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:20.901873 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:08:20.902474 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:20.902449 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 22:08:30.903144 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:30.903068 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 22:08:40.903269 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:40.903236 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 22:08:50.902938 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:08:50.902895 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 22:09:00.903395 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:00.903355 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 22:09:10.902640 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:10.902599 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 22:09:20.903016 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:20.902986 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:09:27.726605 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.726563 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb"] Apr 24 22:09:27.727213 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.726987 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" containerID="cri-o://6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b" gracePeriod=30 Apr 24 22:09:27.727213 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.727101 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kube-rbac-proxy" 
containerID="cri-o://ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055" gracePeriod=30 Apr 24 22:09:27.821044 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821006 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp"] Apr 24 22:09:27.821408 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821390 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="storage-initializer" Apr 24 22:09:27.821521 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821411 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="storage-initializer" Apr 24 22:09:27.821521 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821424 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" Apr 24 22:09:27.821521 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821432 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" Apr 24 22:09:27.821521 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821449 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kube-rbac-proxy" Apr 24 22:09:27.821521 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821458 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kube-rbac-proxy" Apr 24 22:09:27.821804 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821553 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kube-rbac-proxy" Apr 24 22:09:27.821804 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.821566 2571 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="1f86845d-a7a3-4031-b938-05e8f95d6ec0" containerName="kserve-container" Apr 24 22:09:27.824723 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.824703 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.827122 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.827098 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 24 22:09:27.827249 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.827232 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 22:09:27.834124 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.834100 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp"] Apr 24 22:09:27.882060 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.882012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cf8bf01-da92-4486-bf31-f5a937bffdd7-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.882224 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.882104 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjfr8\" (UniqueName: \"kubernetes.io/projected/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kube-api-access-gjfr8\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.882224 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 22:09:27.882156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.882224 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.882196 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cf8bf01-da92-4486-bf31-f5a937bffdd7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.982623 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.982518 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjfr8\" (UniqueName: \"kubernetes.io/projected/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kube-api-access-gjfr8\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.982623 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.982573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.982623 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.982598 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cf8bf01-da92-4486-bf31-f5a937bffdd7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.982927 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.982635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cf8bf01-da92-4486-bf31-f5a937bffdd7-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.983075 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.983052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.983356 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.983334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cf8bf01-da92-4486-bf31-f5a937bffdd7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.985533 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.985512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cf8bf01-da92-4486-bf31-f5a937bffdd7-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:27.991339 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:27.991289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjfr8\" (UniqueName: \"kubernetes.io/projected/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kube-api-access-gjfr8\") pod \"sklearn-v2-mlserver-predictor-65d8664766-746bp\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:28.103272 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:28.103239 2571 generic.go:358] "Generic (PLEG): container finished" podID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerID="ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055" exitCode=2 Apr 24 22:09:28.103482 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:28.103339 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" event={"ID":"194af666-7fcc-4d84-93d0-d8efdfda22f0","Type":"ContainerDied","Data":"ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055"} Apr 24 22:09:28.136232 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:28.136193 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:28.265745 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:28.265709 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp"] Apr 24 22:09:28.268571 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:09:28.268542 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf8bf01_da92_4486_bf31_f5a937bffdd7.slice/crio-4607bcfbf0b049349976612f1d200a42a09b1837b3f59bcd16f94482bc4849d3 WatchSource:0}: Error finding container 4607bcfbf0b049349976612f1d200a42a09b1837b3f59bcd16f94482bc4849d3: Status 404 returned error can't find the container with id 4607bcfbf0b049349976612f1d200a42a09b1837b3f59bcd16f94482bc4849d3 Apr 24 22:09:29.107894 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:29.107861 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" event={"ID":"4cf8bf01-da92-4486-bf31-f5a937bffdd7","Type":"ContainerStarted","Data":"dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03"} Apr 24 22:09:29.107894 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:29.107894 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" event={"ID":"4cf8bf01-da92-4486-bf31-f5a937bffdd7","Type":"ContainerStarted","Data":"4607bcfbf0b049349976612f1d200a42a09b1837b3f59bcd16f94482bc4849d3"} Apr 24 22:09:30.898681 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:30.898628 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused" Apr 24 
22:09:30.902968 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:30.902941 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 22:09:32.116671 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.116637 2571 generic.go:358] "Generic (PLEG): container finished" podID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerID="dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03" exitCode=0 Apr 24 22:09:32.117080 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.116711 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" event={"ID":"4cf8bf01-da92-4486-bf31-f5a937bffdd7","Type":"ContainerDied","Data":"dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03"} Apr 24 22:09:32.290097 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.290067 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:09:32.315551 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.315494 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194af666-7fcc-4d84-93d0-d8efdfda22f0-kserve-provision-location\") pod \"194af666-7fcc-4d84-93d0-d8efdfda22f0\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " Apr 24 22:09:32.315551 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.315537 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sqvt\" (UniqueName: \"kubernetes.io/projected/194af666-7fcc-4d84-93d0-d8efdfda22f0-kube-api-access-5sqvt\") pod \"194af666-7fcc-4d84-93d0-d8efdfda22f0\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " Apr 24 22:09:32.315773 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.315578 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls\") pod \"194af666-7fcc-4d84-93d0-d8efdfda22f0\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " Apr 24 22:09:32.315773 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.315608 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/194af666-7fcc-4d84-93d0-d8efdfda22f0-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"194af666-7fcc-4d84-93d0-d8efdfda22f0\" (UID: \"194af666-7fcc-4d84-93d0-d8efdfda22f0\") " Apr 24 22:09:32.315940 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.315851 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194af666-7fcc-4d84-93d0-d8efdfda22f0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "194af666-7fcc-4d84-93d0-d8efdfda22f0" (UID: 
"194af666-7fcc-4d84-93d0-d8efdfda22f0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:32.316049 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.316025 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/194af666-7fcc-4d84-93d0-d8efdfda22f0-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "194af666-7fcc-4d84-93d0-d8efdfda22f0" (UID: "194af666-7fcc-4d84-93d0-d8efdfda22f0"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:09:32.318162 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.318132 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "194af666-7fcc-4d84-93d0-d8efdfda22f0" (UID: "194af666-7fcc-4d84-93d0-d8efdfda22f0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:09:32.318285 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.318169 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194af666-7fcc-4d84-93d0-d8efdfda22f0-kube-api-access-5sqvt" (OuterVolumeSpecName: "kube-api-access-5sqvt") pod "194af666-7fcc-4d84-93d0-d8efdfda22f0" (UID: "194af666-7fcc-4d84-93d0-d8efdfda22f0"). InnerVolumeSpecName "kube-api-access-5sqvt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:09:32.416416 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.416323 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194af666-7fcc-4d84-93d0-d8efdfda22f0-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:09:32.416416 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.416354 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5sqvt\" (UniqueName: \"kubernetes.io/projected/194af666-7fcc-4d84-93d0-d8efdfda22f0-kube-api-access-5sqvt\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:09:32.416416 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.416364 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/194af666-7fcc-4d84-93d0-d8efdfda22f0-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:09:32.416416 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:32.416376 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/194af666-7fcc-4d84-93d0-d8efdfda22f0-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:09:33.122074 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.122037 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" event={"ID":"4cf8bf01-da92-4486-bf31-f5a937bffdd7","Type":"ContainerStarted","Data":"f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd"} Apr 24 22:09:33.122074 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.122079 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" 
event={"ID":"4cf8bf01-da92-4486-bf31-f5a937bffdd7","Type":"ContainerStarted","Data":"07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9"} Apr 24 22:09:33.122745 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.122338 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:33.123710 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.123685 2571 generic.go:358] "Generic (PLEG): container finished" podID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerID="6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b" exitCode=0 Apr 24 22:09:33.123836 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.123718 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" event={"ID":"194af666-7fcc-4d84-93d0-d8efdfda22f0","Type":"ContainerDied","Data":"6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b"} Apr 24 22:09:33.123836 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.123741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" event={"ID":"194af666-7fcc-4d84-93d0-d8efdfda22f0","Type":"ContainerDied","Data":"7d2c85389345f9907635301c01e1a7c284814a2104ee3687e8f57e4ee9987b7b"} Apr 24 22:09:33.123836 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.123749 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb" Apr 24 22:09:33.123836 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.123760 2571 scope.go:117] "RemoveContainer" containerID="ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055" Apr 24 22:09:33.132819 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.132801 2571 scope.go:117] "RemoveContainer" containerID="6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b" Apr 24 22:09:33.140353 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.140334 2571 scope.go:117] "RemoveContainer" containerID="5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8" Apr 24 22:09:33.145844 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.145799 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" podStartSLOduration=6.145782225 podStartE2EDuration="6.145782225s" podCreationTimestamp="2026-04-24 22:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:09:33.143709245 +0000 UTC m=+2573.991753456" watchObservedRunningTime="2026-04-24 22:09:33.145782225 +0000 UTC m=+2573.993826342" Apr 24 22:09:33.148246 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.148231 2571 scope.go:117] "RemoveContainer" containerID="ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055" Apr 24 22:09:33.148558 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:09:33.148540 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055\": container with ID starting with ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055 not found: ID does not exist" containerID="ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055" Apr 24 22:09:33.148634 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.148564 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055"} err="failed to get container status \"ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055\": rpc error: code = NotFound desc = could not find container \"ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055\": container with ID starting with ee690d234dc4ad634f34b436960b04976d6b25eb70b92a56f74b7682bf53d055 not found: ID does not exist" Apr 24 22:09:33.148634 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.148580 2571 scope.go:117] "RemoveContainer" containerID="6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b" Apr 24 22:09:33.148818 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:09:33.148802 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b\": container with ID starting with 6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b not found: ID does not exist" containerID="6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b" Apr 24 22:09:33.148872 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.148822 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b"} err="failed to get container status \"6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b\": rpc error: code = NotFound desc = could not find container \"6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b\": container with ID starting with 6e61cf83bc09fe6ff1c1295ddaaf312bfc8144ea3b5844fe4f879ab81e929b9b not found: ID does not exist" Apr 24 22:09:33.148872 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.148835 2571 scope.go:117] "RemoveContainer" 
containerID="5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8" Apr 24 22:09:33.149044 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:09:33.149028 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8\": container with ID starting with 5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8 not found: ID does not exist" containerID="5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8" Apr 24 22:09:33.149087 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.149047 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8"} err="failed to get container status \"5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8\": rpc error: code = NotFound desc = could not find container \"5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8\": container with ID starting with 5fad1f3f8a9b8a1ff4ddd1e0d6fd61fe77ca98bc4b5be025857aac0e7d0804d8 not found: ID does not exist" Apr 24 22:09:33.157003 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.156983 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb"] Apr 24 22:09:33.162345 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.162324 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-77f5c96b44-5d9wb"] Apr 24 22:09:33.647692 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:33.647653 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" path="/var/lib/kubelet/pods/194af666-7fcc-4d84-93d0-d8efdfda22f0/volumes" Apr 24 22:09:34.128255 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:34.128229 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:09:40.136142 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:09:40.136113 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:10:10.225923 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:10.225887 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 22:10:20.139229 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:20.139198 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" Apr 24 22:10:27.928349 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.928308 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp"] Apr 24 22:10:27.928838 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.928808 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kserve-container" containerID="cri-o://07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9" gracePeriod=30 Apr 24 22:10:27.928921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.928847 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kube-rbac-proxy" containerID="cri-o://f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd" gracePeriod=30 Apr 24 22:10:27.993153 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993113 2571 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"] Apr 24 22:10:27.993435 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993422 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="storage-initializer" Apr 24 22:10:27.993492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993438 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="storage-initializer" Apr 24 22:10:27.993492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993446 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" Apr 24 22:10:27.993492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993451 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" Apr 24 22:10:27.993492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993465 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kube-rbac-proxy" Apr 24 22:10:27.993492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993472 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kube-rbac-proxy" Apr 24 22:10:27.993651 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993513 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kube-rbac-proxy" Apr 24 22:10:27.993651 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.993520 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="194af666-7fcc-4d84-93d0-d8efdfda22f0" containerName="kserve-container" Apr 24 22:10:27.996545 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.996528 2571 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" Apr 24 22:10:27.998883 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.998852 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:10:27.999005 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:27.998867 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 24 22:10:28.005075 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.005053 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"] Apr 24 22:10:28.141330 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.141283 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53dd487b-1df4-437f-b358-c066279de55c-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" Apr 24 22:10:28.141494 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.141337 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53dd487b-1df4-437f-b358-c066279de55c-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" Apr 24 22:10:28.141494 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.141372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53dd487b-1df4-437f-b358-c066279de55c-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.141494 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.141393 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x7xs\" (UniqueName: \"kubernetes.io/projected/53dd487b-1df4-437f-b358-c066279de55c-kube-api-access-7x7xs\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.242562 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.242535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53dd487b-1df4-437f-b358-c066279de55c-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.242692 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.242573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x7xs\" (UniqueName: \"kubernetes.io/projected/53dd487b-1df4-437f-b358-c066279de55c-kube-api-access-7x7xs\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.242692 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.242631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53dd487b-1df4-437f-b358-c066279de55c-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.242692 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.242652 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53dd487b-1df4-437f-b358-c066279de55c-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.243050 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.243030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53dd487b-1df4-437f-b358-c066279de55c-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.243229 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.243210 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53dd487b-1df4-437f-b358-c066279de55c-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.245267 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.245243 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53dd487b-1df4-437f-b358-c066279de55c-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.251653 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.251628 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x7xs\" (UniqueName: \"kubernetes.io/projected/53dd487b-1df4-437f-b358-c066279de55c-kube-api-access-7x7xs\") pod \"isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.276083 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.276059 2571 generic.go:358] "Generic (PLEG): container finished" podID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerID="f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd" exitCode=2
Apr 24 22:10:28.276206 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.276182 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" event={"ID":"4cf8bf01-da92-4486-bf31-f5a937bffdd7","Type":"ContainerDied","Data":"f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd"}
Apr 24 22:10:28.308352 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.308327 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:28.434355 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:28.431885 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"]
Apr 24 22:10:28.436319 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:10:28.436261 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53dd487b_1df4_437f_b358_c066279de55c.slice/crio-86e44b151512b7ee3964a966052e899a249d5177f8df338a7a6d0c6667c4dc6f WatchSource:0}: Error finding container 86e44b151512b7ee3964a966052e899a249d5177f8df338a7a6d0c6667c4dc6f: Status 404 returned error can't find the container with id 86e44b151512b7ee3964a966052e899a249d5177f8df338a7a6d0c6667c4dc6f
Apr 24 22:10:29.279836 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:29.279793 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" event={"ID":"53dd487b-1df4-437f-b358-c066279de55c","Type":"ContainerStarted","Data":"7484dc0f834208137fdde11ec00c234c76d46893b11252b33fea48ed2b460455"}
Apr 24 22:10:29.279836 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:29.279838 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" event={"ID":"53dd487b-1df4-437f-b358-c066279de55c","Type":"ContainerStarted","Data":"86e44b151512b7ee3964a966052e899a249d5177f8df338a7a6d0c6667c4dc6f"}
Apr 24 22:10:30.131487 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:30.131449 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused"
Apr 24 22:10:31.178470 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:31.178424 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/sklearn-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 24 22:10:34.296051 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:34.296024 2571 generic.go:358] "Generic (PLEG): container finished" podID="53dd487b-1df4-437f-b358-c066279de55c" containerID="7484dc0f834208137fdde11ec00c234c76d46893b11252b33fea48ed2b460455" exitCode=0
Apr 24 22:10:34.296358 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:34.296085 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" event={"ID":"53dd487b-1df4-437f-b358-c066279de55c","Type":"ContainerDied","Data":"7484dc0f834208137fdde11ec00c234c76d46893b11252b33fea48ed2b460455"}
Apr 24 22:10:35.131263 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.131213 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused"
Apr 24 22:10:35.301128 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.301090 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" event={"ID":"53dd487b-1df4-437f-b358-c066279de55c","Type":"ContainerStarted","Data":"a5b42c0c23434cd10846bfbe064c37679b4dbc9d21dca07a29c0b60edf215c2c"}
Apr 24 22:10:35.301535 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.301138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" event={"ID":"53dd487b-1df4-437f-b358-c066279de55c","Type":"ContainerStarted","Data":"8f37ceb4f6118e25db23cffbb077b8c9f73b4881df05efbb765c8420dbcb3295"}
Apr 24 22:10:35.301535 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.301349 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:35.323547 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.323492 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" podStartSLOduration=8.323473907 podStartE2EDuration="8.323473907s" podCreationTimestamp="2026-04-24 22:10:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:10:35.322036395 +0000 UTC m=+2636.170080511" watchObservedRunningTime="2026-04-24 22:10:35.323473907 +0000 UTC m=+2636.171518024"
Apr 24 22:10:35.769200 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.769174 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp"
Apr 24 22:10:35.906638 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.906558 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kserve-provision-location\") pod \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") "
Apr 24 22:10:35.906638 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.906628 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cf8bf01-da92-4486-bf31-f5a937bffdd7-proxy-tls\") pod \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") "
Apr 24 22:10:35.906843 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.906670 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjfr8\" (UniqueName: \"kubernetes.io/projected/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kube-api-access-gjfr8\") pod \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") "
Apr 24 22:10:35.906843 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.906708 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cf8bf01-da92-4486-bf31-f5a937bffdd7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\" (UID: \"4cf8bf01-da92-4486-bf31-f5a937bffdd7\") "
Apr 24 22:10:35.906960 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.906913 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod
"4cf8bf01-da92-4486-bf31-f5a937bffdd7" (UID: "4cf8bf01-da92-4486-bf31-f5a937bffdd7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:10:35.907101 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.907074 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf8bf01-da92-4486-bf31-f5a937bffdd7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "4cf8bf01-da92-4486-bf31-f5a937bffdd7" (UID: "4cf8bf01-da92-4486-bf31-f5a937bffdd7"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:10:35.908939 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.908911 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kube-api-access-gjfr8" (OuterVolumeSpecName: "kube-api-access-gjfr8") pod "4cf8bf01-da92-4486-bf31-f5a937bffdd7" (UID: "4cf8bf01-da92-4486-bf31-f5a937bffdd7"). InnerVolumeSpecName "kube-api-access-gjfr8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:10:35.909063 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:35.908995 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cf8bf01-da92-4486-bf31-f5a937bffdd7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4cf8bf01-da92-4486-bf31-f5a937bffdd7" (UID: "4cf8bf01-da92-4486-bf31-f5a937bffdd7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:10:36.007698 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.007657 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cf8bf01-da92-4486-bf31-f5a937bffdd7-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:10:36.007698 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.007692 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjfr8\" (UniqueName: \"kubernetes.io/projected/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kube-api-access-gjfr8\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:10:36.007896 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.007709 2571 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cf8bf01-da92-4486-bf31-f5a937bffdd7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:10:36.007896 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.007723 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cf8bf01-da92-4486-bf31-f5a937bffdd7-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:10:36.305681 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.305638 2571 generic.go:358] "Generic (PLEG): container finished" podID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerID="07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9" exitCode=0
Apr 24 22:10:36.306154 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.305715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" event={"ID":"4cf8bf01-da92-4486-bf31-f5a937bffdd7","Type":"ContainerDied","Data":"07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9"}
Apr 24 22:10:36.306154 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.305759 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp"
Apr 24 22:10:36.306154 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.305780 2571 scope.go:117] "RemoveContainer" containerID="f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd"
Apr 24 22:10:36.306154 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.305764 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp" event={"ID":"4cf8bf01-da92-4486-bf31-f5a937bffdd7","Type":"ContainerDied","Data":"4607bcfbf0b049349976612f1d200a42a09b1837b3f59bcd16f94482bc4849d3"}
Apr 24 22:10:36.306413 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.306342 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:36.307805 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.307767 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 22:10:36.315482 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.315467 2571 scope.go:117] "RemoveContainer" containerID="07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9"
Apr 24 22:10:36.322323 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.322283 2571 scope.go:117] "RemoveContainer" containerID="dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03"
Apr 24 22:10:36.327195 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.327176 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp"]
Apr 24 22:10:36.330124 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.330105 2571 scope.go:117] "RemoveContainer" containerID="f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd"
Apr 24 22:10:36.330495 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:10:36.330468 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd\": container with ID starting with f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd not found: ID does not exist" containerID="f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd"
Apr 24 22:10:36.330556 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.330507 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd"} err="failed to get container status \"f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd\": rpc error: code = NotFound desc = could not find container \"f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd\": container with ID starting with f01b5cc23a84e4ba8d5871572528fa96aa65a2c1bd21b015b6369cbada4b25bd not found: ID does not exist"
Apr 24 22:10:36.330556 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.330531 2571 scope.go:117] "RemoveContainer" containerID="07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9"
Apr 24 22:10:36.330840 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:10:36.330820 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9\": container with ID starting with 07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9 not found: ID does not exist" containerID="07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9"
Apr 24 22:10:36.330915 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.330849 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9"} err="failed to get container status \"07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9\": rpc error: code = NotFound desc = could not find container \"07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9\": container with ID starting with 07922b918fb9c803d84578046dd7be4dd0847d1edb683ad6bc1aca96009705e9 not found: ID does not exist"
Apr 24 22:10:36.330915 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.330861 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-746bp"]
Apr 24 22:10:36.330915 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.330872 2571 scope.go:117] "RemoveContainer" containerID="dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03"
Apr 24 22:10:36.331124 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:10:36.331108 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03\": container with ID starting with dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03 not found: ID does not exist" containerID="dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03"
Apr 24 22:10:36.331163 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:36.331132 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03"} err="failed to get container status \"dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03\": rpc error: code = NotFound desc = could not find container \"dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03\": container with ID starting with dd42906e8f503672091188e31eff5ab0076b2acaaff2154c32736afd09b63c03 not found: ID does not exist"
Apr 24 22:10:37.310369 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:37.310327 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 22:10:37.647913 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:37.647833 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" path="/var/lib/kubelet/pods/4cf8bf01-da92-4486-bf31-f5a937bffdd7/volumes"
Apr 24 22:10:42.315068 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:42.315034 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:10:42.315628 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:42.315604 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 22:10:52.316152 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:10:52.316116 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"
Apr 24 22:11:05.041006 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.040973 2571 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5_53dd487b-1df4-437f-b358-c066279de55c/kserve-container/0.log"
Apr 24 22:11:05.295184 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.295106 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"]
Apr 24 22:11:05.295505 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.295444 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kserve-container" containerID="cri-o://8f37ceb4f6118e25db23cffbb077b8c9f73b4881df05efbb765c8420dbcb3295" gracePeriod=30
Apr 24 22:11:05.295505 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.295482 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kube-rbac-proxy" containerID="cri-o://a5b42c0c23434cd10846bfbe064c37679b4dbc9d21dca07a29c0b60edf215c2c" gracePeriod=30
Apr 24 22:11:05.371104 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371072 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"]
Apr 24 22:11:05.371385 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371373 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="storage-initializer"
Apr 24 22:11:05.371438 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371386 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="storage-initializer"
Apr 24 22:11:05.371438 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371399 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kserve-container"
Apr 24 22:11:05.371438 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371404 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kserve-container"
Apr 24 22:11:05.371438 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371413 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kube-rbac-proxy"
Apr 24 22:11:05.371438 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371418 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kube-rbac-proxy"
Apr 24 22:11:05.371586 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371480 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kube-rbac-proxy"
Apr 24 22:11:05.371586 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.371491 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cf8bf01-da92-4486-bf31-f5a937bffdd7" containerName="kserve-container"
Apr 24 22:11:05.374504 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.374488 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.377202 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.377174 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\""
Apr 24 22:11:05.377352 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.377209 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 24 22:11:05.387895 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.387874 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"]
Apr 24 22:11:05.435451 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.435409 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.435451 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.435454 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4df2w\" (UniqueName: \"kubernetes.io/projected/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kube-api-access-4df2w\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.435644 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.435507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.435644 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.435547 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.536349 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.536315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.536541 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.536373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.536541 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.536392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4df2w\" (UniqueName: \"kubernetes.io/projected/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kube-api-access-4df2w\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.536541 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.536431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.536734 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.536713 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.537018 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.536995 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.538971 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.538944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.545289 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.545227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4df2w\" (UniqueName: \"kubernetes.io/projected/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kube-api-access-4df2w\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.683921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.683884 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"
Apr 24 22:11:05.815544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:05.815516 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"]
Apr 24 22:11:06.397092 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.397063 2571 generic.go:358] "Generic (PLEG): container finished" podID="53dd487b-1df4-437f-b358-c066279de55c" containerID="a5b42c0c23434cd10846bfbe064c37679b4dbc9d21dca07a29c0b60edf215c2c" exitCode=2
Apr 24 22:11:06.397092 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.397087 2571 generic.go:358] "Generic (PLEG): container finished" podID="53dd487b-1df4-437f-b358-c066279de55c" containerID="8f37ceb4f6118e25db23cffbb077b8c9f73b4881df05efbb765c8420dbcb3295" exitCode=0
Apr 24 22:11:06.397526 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.397134 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" event={"ID":"53dd487b-1df4-437f-b358-c066279de55c","Type":"ContainerDied","Data":"a5b42c0c23434cd10846bfbe064c37679b4dbc9d21dca07a29c0b60edf215c2c"}
Apr 24 22:11:06.397526 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.397173 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" event={"ID":"53dd487b-1df4-437f-b358-c066279de55c","Type":"ContainerDied","Data":"8f37ceb4f6118e25db23cffbb077b8c9f73b4881df05efbb765c8420dbcb3295"}
Apr 24 22:11:06.398426 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.398395 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" event={"ID":"c7d74a30-6d62-4f66-bcc1-33f4da693fe8","Type":"ContainerStarted","Data":"0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0"}
Apr 24 22:11:06.398554 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.398436 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" event={"ID":"c7d74a30-6d62-4f66-bcc1-33f4da693fe8","Type":"ContainerStarted","Data":"4b102e4b31e7a8b2e5098bcde3e96bc5e11b8a8a83062b433501f3e115d2c8d4"}
Apr 24 22:11:06.432877 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.432853 2571 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" Apr 24 22:11:06.543407 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.543317 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53dd487b-1df4-437f-b358-c066279de55c-kserve-provision-location\") pod \"53dd487b-1df4-437f-b358-c066279de55c\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " Apr 24 22:11:06.543407 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.543376 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53dd487b-1df4-437f-b358-c066279de55c-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"53dd487b-1df4-437f-b358-c066279de55c\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " Apr 24 22:11:06.543643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.543411 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53dd487b-1df4-437f-b358-c066279de55c-proxy-tls\") pod \"53dd487b-1df4-437f-b358-c066279de55c\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " Apr 24 22:11:06.543643 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.543446 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x7xs\" (UniqueName: \"kubernetes.io/projected/53dd487b-1df4-437f-b358-c066279de55c-kube-api-access-7x7xs\") pod \"53dd487b-1df4-437f-b358-c066279de55c\" (UID: \"53dd487b-1df4-437f-b358-c066279de55c\") " Apr 24 22:11:06.543736 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.543715 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53dd487b-1df4-437f-b358-c066279de55c-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "53dd487b-1df4-437f-b358-c066279de55c" (UID: "53dd487b-1df4-437f-b358-c066279de55c"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:11:06.545695 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.545671 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53dd487b-1df4-437f-b358-c066279de55c-kube-api-access-7x7xs" (OuterVolumeSpecName: "kube-api-access-7x7xs") pod "53dd487b-1df4-437f-b358-c066279de55c" (UID: "53dd487b-1df4-437f-b358-c066279de55c"). InnerVolumeSpecName "kube-api-access-7x7xs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:11:06.545797 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.545694 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53dd487b-1df4-437f-b358-c066279de55c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "53dd487b-1df4-437f-b358-c066279de55c" (UID: "53dd487b-1df4-437f-b358-c066279de55c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:11:06.573221 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.573184 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53dd487b-1df4-437f-b358-c066279de55c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "53dd487b-1df4-437f-b358-c066279de55c" (UID: "53dd487b-1df4-437f-b358-c066279de55c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:11:06.644123 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.644090 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53dd487b-1df4-437f-b358-c066279de55c-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:11:06.644123 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.644114 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53dd487b-1df4-437f-b358-c066279de55c-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:11:06.644123 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.644124 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53dd487b-1df4-437f-b358-c066279de55c-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:11:06.644123 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:06.644133 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7x7xs\" (UniqueName: \"kubernetes.io/projected/53dd487b-1df4-437f-b358-c066279de55c-kube-api-access-7x7xs\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:11:07.402900 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:07.402872 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" Apr 24 22:11:07.402900 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:07.402876 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5" event={"ID":"53dd487b-1df4-437f-b358-c066279de55c","Type":"ContainerDied","Data":"86e44b151512b7ee3964a966052e899a249d5177f8df338a7a6d0c6667c4dc6f"} Apr 24 22:11:07.403442 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:07.402926 2571 scope.go:117] "RemoveContainer" containerID="a5b42c0c23434cd10846bfbe064c37679b4dbc9d21dca07a29c0b60edf215c2c" Apr 24 22:11:07.411509 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:07.411491 2571 scope.go:117] "RemoveContainer" containerID="8f37ceb4f6118e25db23cffbb077b8c9f73b4881df05efbb765c8420dbcb3295" Apr 24 22:11:07.418287 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:07.418269 2571 scope.go:117] "RemoveContainer" containerID="7484dc0f834208137fdde11ec00c234c76d46893b11252b33fea48ed2b460455" Apr 24 22:11:07.424949 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:07.424926 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"] Apr 24 22:11:07.428821 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:07.428800 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7b5dc59794-c5rz5"] Apr 24 22:11:07.647702 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:07.647669 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53dd487b-1df4-437f-b358-c066279de55c" path="/var/lib/kubelet/pods/53dd487b-1df4-437f-b358-c066279de55c/volumes" Apr 24 22:11:09.410122 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:09.410091 2571 generic.go:358] "Generic (PLEG): container finished" podID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerID="0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0" 
exitCode=0 Apr 24 22:11:09.410493 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:09.410176 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" event={"ID":"c7d74a30-6d62-4f66-bcc1-33f4da693fe8","Type":"ContainerDied","Data":"0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0"} Apr 24 22:11:10.414610 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:10.414576 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" event={"ID":"c7d74a30-6d62-4f66-bcc1-33f4da693fe8","Type":"ContainerStarted","Data":"4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588"} Apr 24 22:11:10.414610 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:10.414618 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" event={"ID":"c7d74a30-6d62-4f66-bcc1-33f4da693fe8","Type":"ContainerStarted","Data":"0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef"} Apr 24 22:11:10.415098 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:10.414843 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" Apr 24 22:11:10.415098 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:10.414868 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" Apr 24 22:11:10.436697 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:10.436652 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" podStartSLOduration=5.436638388 podStartE2EDuration="5.436638388s" podCreationTimestamp="2026-04-24 22:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:11:10.434609608 +0000 UTC m=+2671.282653722" watchObservedRunningTime="2026-04-24 22:11:10.436638388 +0000 UTC m=+2671.284682501" Apr 24 22:11:16.423709 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:16.423677 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" Apr 24 22:11:46.427400 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:46.427353 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 22:11:56.426712 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:11:56.426684 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" Apr 24 22:12:05.404086 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.404043 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"] Apr 24 22:12:05.404630 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.404497 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kserve-container" containerID="cri-o://0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef" gracePeriod=30 Apr 24 22:12:05.404630 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.404544 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kube-rbac-proxy" 
containerID="cri-o://4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588" gracePeriod=30 Apr 24 22:12:05.463805 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.463774 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc"] Apr 24 22:12:05.464086 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.464073 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kube-rbac-proxy" Apr 24 22:12:05.464137 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.464089 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kube-rbac-proxy" Apr 24 22:12:05.464137 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.464106 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kserve-container" Apr 24 22:12:05.464137 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.464111 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kserve-container" Apr 24 22:12:05.464137 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.464118 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="storage-initializer" Apr 24 22:12:05.464137 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.464124 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="storage-initializer" Apr 24 22:12:05.464331 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.464165 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kube-rbac-proxy" Apr 24 22:12:05.464331 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.464174 2571 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="53dd487b-1df4-437f-b358-c066279de55c" containerName="kserve-container" Apr 24 22:12:05.467390 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.467371 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.469897 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.469872 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 24 22:12:05.470006 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.469876 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:12:05.478406 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.478372 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc"] Apr 24 22:12:05.515065 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.515036 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.515221 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.515082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wln9p\" (UniqueName: \"kubernetes.io/projected/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kube-api-access-wln9p\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.515221 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 22:12:05.515108 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.515221 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.515124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.566116 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.566086 2571 generic.go:358] "Generic (PLEG): container finished" podID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerID="4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588" exitCode=2 Apr 24 22:12:05.566279 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.566143 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" event={"ID":"c7d74a30-6d62-4f66-bcc1-33f4da693fe8","Type":"ContainerDied","Data":"4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588"} Apr 24 22:12:05.615719 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.615684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.615914 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.615739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wln9p\" (UniqueName: \"kubernetes.io/projected/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kube-api-access-wln9p\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.615914 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.615767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.615914 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.615786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.616061 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:12:05.615932 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 24 22:12:05.616061 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:12:05.616014 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls podName:f66c71d6-aff1-494e-9edd-fcbd1a1de6d3 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:12:06.115992393 +0000 UTC m=+2726.964036488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls") pod "isvc-sklearn-v2-predictor-7c9dd679db-q64wc" (UID: "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3") : secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 24 22:12:05.616169 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.616149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.616450 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.616434 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:05.628972 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:05.628948 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wln9p\" (UniqueName: \"kubernetes.io/projected/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kube-api-access-wln9p\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:06.119907 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:06.119851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:06.122392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:06.122363 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls\") pod \"isvc-sklearn-v2-predictor-7c9dd679db-q64wc\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:06.377523 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:06.377433 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:06.418737 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:06.418684 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.47:8643/healthz\": dial tcp 10.134.0.47:8643: connect: connection refused" Apr 24 22:12:06.503091 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:06.502847 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc"] Apr 24 22:12:06.505472 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:12:06.505437 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf66c71d6_aff1_494e_9edd_fcbd1a1de6d3.slice/crio-dcefda9c9d6c10aa4f0a5f45ed3474bfa96bc2e6af4511fd325cde300a1fac80 WatchSource:0}: Error finding container dcefda9c9d6c10aa4f0a5f45ed3474bfa96bc2e6af4511fd325cde300a1fac80: Status 404 returned error can't find the container with 
id dcefda9c9d6c10aa4f0a5f45ed3474bfa96bc2e6af4511fd325cde300a1fac80 Apr 24 22:12:06.572582 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:06.572556 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" event={"ID":"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3","Type":"ContainerStarted","Data":"5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922"} Apr 24 22:12:06.572687 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:06.572591 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" event={"ID":"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3","Type":"ContainerStarted","Data":"dcefda9c9d6c10aa4f0a5f45ed3474bfa96bc2e6af4511fd325cde300a1fac80"} Apr 24 22:12:07.465562 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:07.465466 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 24 22:12:10.586108 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:10.586075 2571 generic.go:358] "Generic (PLEG): container finished" podID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerID="5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922" exitCode=0 Apr 24 22:12:10.586543 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:10.586160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" event={"ID":"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3","Type":"ContainerDied","Data":"5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922"} Apr 24 22:12:11.419370 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:11.419288 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.47:8643/healthz\": dial tcp 10.134.0.47:8643: connect: connection refused" Apr 24 22:12:11.591365 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:11.591330 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" event={"ID":"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3","Type":"ContainerStarted","Data":"73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4"} Apr 24 22:12:11.591365 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:11.591370 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" event={"ID":"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3","Type":"ContainerStarted","Data":"9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e"} Apr 24 22:12:11.591805 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:11.591696 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:11.591805 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:11.591723 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:11.593096 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:11.593070 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:12:11.611671 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:11.611629 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podStartSLOduration=6.611614929 podStartE2EDuration="6.611614929s" podCreationTimestamp="2026-04-24 22:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:12:11.610671425 +0000 UTC m=+2732.458715544" watchObservedRunningTime="2026-04-24 22:12:11.611614929 +0000 UTC m=+2732.459659039" Apr 24 22:12:12.594615 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:12.594576 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:12:13.047452 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.047425 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" Apr 24 22:12:13.075870 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.075843 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4df2w\" (UniqueName: \"kubernetes.io/projected/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kube-api-access-4df2w\") pod \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " Apr 24 22:12:13.076033 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.075922 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kserve-provision-location\") pod \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " Apr 24 22:12:13.076033 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.075990 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-proxy-tls\") pod \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " Apr 24 22:12:13.076136 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.076039 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\" (UID: \"c7d74a30-6d62-4f66-bcc1-33f4da693fe8\") " Apr 24 22:12:13.076271 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.076231 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7d74a30-6d62-4f66-bcc1-33f4da693fe8" (UID: "c7d74a30-6d62-4f66-bcc1-33f4da693fe8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:12:13.076633 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.076603 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "c7d74a30-6d62-4f66-bcc1-33f4da693fe8" (UID: "c7d74a30-6d62-4f66-bcc1-33f4da693fe8"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:12:13.078205 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.078176 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kube-api-access-4df2w" (OuterVolumeSpecName: "kube-api-access-4df2w") pod "c7d74a30-6d62-4f66-bcc1-33f4da693fe8" (UID: "c7d74a30-6d62-4f66-bcc1-33f4da693fe8"). InnerVolumeSpecName "kube-api-access-4df2w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:12:13.078205 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.078197 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c7d74a30-6d62-4f66-bcc1-33f4da693fe8" (UID: "c7d74a30-6d62-4f66-bcc1-33f4da693fe8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:12:13.176896 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.176797 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:12:13.176896 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.176835 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:12:13.176896 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.176847 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4df2w\" (UniqueName: \"kubernetes.io/projected/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kube-api-access-4df2w\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:12:13.176896 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.176857 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d74a30-6d62-4f66-bcc1-33f4da693fe8-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:12:13.598359 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.598323 2571 generic.go:358] "Generic (PLEG): container finished" podID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerID="0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef" exitCode=0 Apr 24 22:12:13.598741 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.598431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" event={"ID":"c7d74a30-6d62-4f66-bcc1-33f4da693fe8","Type":"ContainerDied","Data":"0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef"} Apr 24 22:12:13.598741 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.598454 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" event={"ID":"c7d74a30-6d62-4f66-bcc1-33f4da693fe8","Type":"ContainerDied","Data":"4b102e4b31e7a8b2e5098bcde3e96bc5e11b8a8a83062b433501f3e115d2c8d4"} Apr 24 22:12:13.598741 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.598457 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px" Apr 24 22:12:13.598741 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.598469 2571 scope.go:117] "RemoveContainer" containerID="4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588" Apr 24 22:12:13.606537 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.606370 2571 scope.go:117] "RemoveContainer" containerID="0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef" Apr 24 22:12:13.613515 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.613496 2571 scope.go:117] "RemoveContainer" containerID="0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0" Apr 24 22:12:13.620240 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.620217 2571 scope.go:117] "RemoveContainer" containerID="4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588" Apr 24 22:12:13.620576 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:12:13.620552 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588\": container with ID starting with 4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588 not found: ID does not exist" containerID="4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588" Apr 24 22:12:13.620670 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.620586 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588"} err="failed to get container status \"4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588\": rpc error: code = NotFound desc = could not find container \"4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588\": container with ID starting with 4f9830e41ffe3e64ef3d34791d230fbde78b784e79abed536950c71da2534588 not found: ID does not exist" Apr 24 22:12:13.620670 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.620610 2571 scope.go:117] "RemoveContainer" containerID="0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef" Apr 24 22:12:13.620670 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.620656 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"] Apr 24 22:12:13.620876 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:12:13.620849 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef\": container with ID starting with 0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef not found: ID does not exist" containerID="0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef" Apr 24 22:12:13.620920 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.620887 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef"} err="failed to get container status \"0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef\": rpc error: code = NotFound desc = could not find container \"0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef\": container with ID starting with 0f703f7eb9678946bf096d1bd23c6b37a320de37b2762c699b6ab8abcb4744ef not found: ID does not exist" Apr 24 22:12:13.620920 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.620910 2571 scope.go:117] "RemoveContainer" containerID="0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0" Apr 24 22:12:13.621143 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:12:13.621127 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0\": container with ID starting with 
0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0 not found: ID does not exist" containerID="0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0" Apr 24 22:12:13.621200 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.621150 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0"} err="failed to get container status \"0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0\": rpc error: code = NotFound desc = could not find container \"0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0\": container with ID starting with 0c724eafa2ef50846d21961d2b8c1cc485abf08a5f07ad4e0a133ea2c99beaf0 not found: ID does not exist" Apr 24 22:12:13.628359 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.626436 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-884px"] Apr 24 22:12:13.647388 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:13.647358 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" path="/var/lib/kubelet/pods/c7d74a30-6d62-4f66-bcc1-33f4da693fe8/volumes" Apr 24 22:12:17.599090 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:17.599062 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:12:17.599613 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:17.599586 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:12:27.600211 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:27.600170 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:12:37.600248 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:37.600207 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:12:47.600521 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:47.600470 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:12:57.599995 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:12:57.599951 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:13:07.600361 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:07.600316 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:13:17.600488 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:17.600458 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:13:25.659415 ip-10-0-139-5 kubenswrapper[2571]: I0424 
22:13:25.659380 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc"] Apr 24 22:13:25.659861 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.659810 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" containerID="cri-o://9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e" gracePeriod=30 Apr 24 22:13:25.659927 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.659811 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kube-rbac-proxy" containerID="cri-o://73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4" gracePeriod=30 Apr 24 22:13:25.712541 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712506 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg"] Apr 24 22:13:25.712813 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712801 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kserve-container" Apr 24 22:13:25.712868 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712815 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kserve-container" Apr 24 22:13:25.712868 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712830 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="storage-initializer" Apr 24 22:13:25.712868 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712835 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="storage-initializer" Apr 24 22:13:25.712868 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712847 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kube-rbac-proxy" Apr 24 22:13:25.712868 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712853 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kube-rbac-proxy" Apr 24 22:13:25.713020 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712896 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kube-rbac-proxy" Apr 24 22:13:25.713020 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.712907 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7d74a30-6d62-4f66-bcc1-33f4da693fe8" containerName="kserve-container" Apr 24 22:13:25.715932 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.715916 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.718142 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.718123 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 24 22:13:25.718256 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.718128 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 24 22:13:25.725955 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.725934 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg"] Apr 24 22:13:25.744454 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.744432 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.744587 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.744467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.744587 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.744578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xg9ql\" (UniqueName: \"kubernetes.io/projected/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kube-api-access-xg9ql\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.744686 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.744612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.799951 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.799918 2571 generic.go:358] "Generic (PLEG): container finished" podID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerID="73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4" exitCode=2 Apr 24 22:13:25.800087 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.799970 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" event={"ID":"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3","Type":"ContainerDied","Data":"73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4"} Apr 24 22:13:25.845582 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.845551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.845664 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.845589 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.845664 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.845654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xg9ql\" (UniqueName: \"kubernetes.io/projected/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kube-api-access-xg9ql\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.845759 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.845688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.845833 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:13:25.845817 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-serving-cert: secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 24 22:13:25.845902 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:13:25.845887 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls podName:5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8 nodeName:}" failed. No retries permitted until 2026-04-24 22:13:26.345867848 +0000 UTC m=+2807.193911942 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls") pod "isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" (UID: "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8") : secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 24 22:13:25.845993 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.845977 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.846266 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.846246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:25.857301 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:25.857268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg9ql\" (UniqueName: \"kubernetes.io/projected/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kube-api-access-xg9ql\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:26.350478 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:26.350443 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:26.352984 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:26.352955 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:26.625895 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:26.625812 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:26.743587 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:26.743556 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg"] Apr 24 22:13:26.746476 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:13:26.746444 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f3bcb9a_d72a_489e_b7eb_eab4e0de9eb8.slice/crio-e53920a53bb8ce40f0d9ee14150eecc3af7898e19451ffd95928f6624f586135 WatchSource:0}: Error finding container e53920a53bb8ce40f0d9ee14150eecc3af7898e19451ffd95928f6624f586135: Status 404 returned error can't find the container with id e53920a53bb8ce40f0d9ee14150eecc3af7898e19451ffd95928f6624f586135 Apr 24 22:13:26.748237 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:26.748220 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:13:26.803446 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:26.803416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" event={"ID":"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8","Type":"ContainerStarted","Data":"e53920a53bb8ce40f0d9ee14150eecc3af7898e19451ffd95928f6624f586135"} Apr 24 22:13:27.594866 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:27.594825 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.48:8643/healthz\": dial tcp 10.134.0.48:8643: connect: connection refused" Apr 24 22:13:27.600112 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:27.600092 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 22:13:27.807677 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:27.807640 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" event={"ID":"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8","Type":"ContainerStarted","Data":"f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa"} Apr 24 22:13:29.688732 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.688701 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:13:29.777327 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.777276 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls\") pod \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " Apr 24 22:13:29.777478 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.777361 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kserve-provision-location\") pod \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " Apr 24 22:13:29.777478 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.777392 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wln9p\" (UniqueName: \"kubernetes.io/projected/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kube-api-access-wln9p\") pod \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " Apr 24 22:13:29.777478 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.777434 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\" (UID: \"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3\") " Apr 24 22:13:29.777727 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.777700 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" 
(UID: "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:13:29.777814 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.777790 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" (UID: "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:13:29.779609 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.779558 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" (UID: "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:13:29.779681 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.779618 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kube-api-access-wln9p" (OuterVolumeSpecName: "kube-api-access-wln9p") pod "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" (UID: "f66c71d6-aff1-494e-9edd-fcbd1a1de6d3"). InnerVolumeSpecName "kube-api-access-wln9p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:13:29.814521 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.814492 2571 generic.go:358] "Generic (PLEG): container finished" podID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerID="9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e" exitCode=0 Apr 24 22:13:29.814628 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.814574 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" Apr 24 22:13:29.814628 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.814572 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" event={"ID":"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3","Type":"ContainerDied","Data":"9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e"} Apr 24 22:13:29.814628 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.814616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc" event={"ID":"f66c71d6-aff1-494e-9edd-fcbd1a1de6d3","Type":"ContainerDied","Data":"dcefda9c9d6c10aa4f0a5f45ed3474bfa96bc2e6af4511fd325cde300a1fac80"} Apr 24 22:13:29.814737 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.814637 2571 scope.go:117] "RemoveContainer" containerID="73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4" Apr 24 22:13:29.822741 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.822721 2571 scope.go:117] "RemoveContainer" containerID="9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e" Apr 24 22:13:29.829650 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.829634 2571 scope.go:117] "RemoveContainer" containerID="5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922" Apr 24 22:13:29.837464 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.837448 2571 scope.go:117] "RemoveContainer" 
containerID="73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4" Apr 24 22:13:29.837718 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:13:29.837694 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4\": container with ID starting with 73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4 not found: ID does not exist" containerID="73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4" Apr 24 22:13:29.837808 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.837725 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4"} err="failed to get container status \"73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4\": rpc error: code = NotFound desc = could not find container \"73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4\": container with ID starting with 73190ada39940c8decddf147b89dafc2e9389bf0ce2b85f0fa237347dbb127e4 not found: ID does not exist" Apr 24 22:13:29.837808 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.837742 2571 scope.go:117] "RemoveContainer" containerID="9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e" Apr 24 22:13:29.838131 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.838105 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc"] Apr 24 22:13:29.838220 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:13:29.838183 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e\": container with ID starting with 9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e not found: ID does not exist" 
containerID="9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e" Apr 24 22:13:29.838285 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.838220 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e"} err="failed to get container status \"9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e\": rpc error: code = NotFound desc = could not find container \"9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e\": container with ID starting with 9813a4f616a5d6e0eb4093b46835042fcb7c2dba4da08a7e8c165bc0c94ab47e not found: ID does not exist" Apr 24 22:13:29.838285 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.838245 2571 scope.go:117] "RemoveContainer" containerID="5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922" Apr 24 22:13:29.838681 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:13:29.838663 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922\": container with ID starting with 5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922 not found: ID does not exist" containerID="5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922" Apr 24 22:13:29.838745 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.838686 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922"} err="failed to get container status \"5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922\": rpc error: code = NotFound desc = could not find container \"5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922\": container with ID starting with 5a4452ec4800182ff64c86cf96c071b021311610a0bcc4eadf073fd1edd67922 not found: ID does not exist" Apr 24 22:13:29.839767 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.839749 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c9dd679db-q64wc"] Apr 24 22:13:29.878182 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.878166 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:13:29.878258 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.878185 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:13:29.878258 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.878196 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wln9p\" (UniqueName: \"kubernetes.io/projected/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-kube-api-access-wln9p\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:13:29.878258 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:29.878206 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:13:30.819012 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:30.818979 2571 generic.go:358] "Generic (PLEG): container finished" podID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerID="f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa" exitCode=0 Apr 24 22:13:30.819426 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:30.819057 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" 
event={"ID":"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8","Type":"ContainerDied","Data":"f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa"} Apr 24 22:13:31.647347 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:31.647290 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" path="/var/lib/kubelet/pods/f66c71d6-aff1-494e-9edd-fcbd1a1de6d3/volumes" Apr 24 22:13:31.825467 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:31.825426 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" event={"ID":"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8","Type":"ContainerStarted","Data":"cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f"} Apr 24 22:13:31.825467 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:31.825475 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" event={"ID":"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8","Type":"ContainerStarted","Data":"8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f"} Apr 24 22:13:31.826001 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:31.825764 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:31.826001 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:31.825911 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:31.827317 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:31.827271 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:13:31.845618 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:31.845573 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podStartSLOduration=6.845560413 podStartE2EDuration="6.845560413s" podCreationTimestamp="2026-04-24 22:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:13:31.844625683 +0000 UTC m=+2812.692669811" watchObservedRunningTime="2026-04-24 22:13:31.845560413 +0000 UTC m=+2812.693604528" Apr 24 22:13:32.828979 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:32.828934 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:13:37.835029 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:37.834998 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:13:37.835546 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:37.835520 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:13:47.836009 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:13:47.835965 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:13:57.835751 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:13:57.835707 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:14:07.835521 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:07.835481 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:14:17.835892 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:17.835849 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:14:27.836480 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:27.836441 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:14:37.836468 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:37.836437 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:14:45.879526 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.879492 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg"] Apr 24 22:14:45.880164 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.879839 2571 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" containerID="cri-o://8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f" gracePeriod=30 Apr 24 22:14:45.880164 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.879904 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kube-rbac-proxy" containerID="cri-o://cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f" gracePeriod=30 Apr 24 22:14:45.956336 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956288 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd"] Apr 24 22:14:45.956578 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956565 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="storage-initializer" Apr 24 22:14:45.956578 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956579 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="storage-initializer" Apr 24 22:14:45.956675 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956586 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" Apr 24 22:14:45.956675 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956591 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" Apr 24 22:14:45.956675 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956612 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kube-rbac-proxy" Apr 24 22:14:45.956675 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956619 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kube-rbac-proxy" Apr 24 22:14:45.956675 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956656 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kserve-container" Apr 24 22:14:45.956675 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.956666 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f66c71d6-aff1-494e-9edd-fcbd1a1de6d3" containerName="kube-rbac-proxy" Apr 24 22:14:45.959750 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.959732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:45.962223 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.962203 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 24 22:14:45.962491 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.962473 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 24 22:14:45.971269 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:45.971248 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd"] Apr 24 22:14:46.035023 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.034991 2571 generic.go:358] "Generic (PLEG): container finished" podID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerID="cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f" exitCode=2 Apr 24 22:14:46.035157 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.035028 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" event={"ID":"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8","Type":"ContainerDied","Data":"cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f"} Apr 24 22:14:46.087379 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.087352 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fchhw\" (UniqueName: \"kubernetes.io/projected/cfcf9344-fd1a-4dce-9e25-0217bd467730-kube-api-access-fchhw\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.087522 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.087467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcf9344-fd1a-4dce-9e25-0217bd467730-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.087522 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.087510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfcf9344-fd1a-4dce-9e25-0217bd467730-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.087618 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.087558 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/cfcf9344-fd1a-4dce-9e25-0217bd467730-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.188544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.188458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fchhw\" (UniqueName: \"kubernetes.io/projected/cfcf9344-fd1a-4dce-9e25-0217bd467730-kube-api-access-fchhw\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.188544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.188527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcf9344-fd1a-4dce-9e25-0217bd467730-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.188739 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.188548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfcf9344-fd1a-4dce-9e25-0217bd467730-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.188739 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.188584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cfcf9344-fd1a-4dce-9e25-0217bd467730-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod 
\"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.189027 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.189008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfcf9344-fd1a-4dce-9e25-0217bd467730-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.189242 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.189219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cfcf9344-fd1a-4dce-9e25-0217bd467730-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.191162 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.191142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcf9344-fd1a-4dce-9e25-0217bd467730-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.201595 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.201570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fchhw\" (UniqueName: \"kubernetes.io/projected/cfcf9344-fd1a-4dce-9e25-0217bd467730-kube-api-access-fchhw\") pod \"isvc-tensorflow-predictor-6756f669d7-24gzd\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 
22:14:46.270378 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.270335 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:46.394832 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:46.394794 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd"] Apr 24 22:14:46.398484 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:14:46.398456 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfcf9344_fd1a_4dce_9e25_0217bd467730.slice/crio-0dac135b92ef638178cc0a9759bff9c9baf951fe6867010259799bbd7054316f WatchSource:0}: Error finding container 0dac135b92ef638178cc0a9759bff9c9baf951fe6867010259799bbd7054316f: Status 404 returned error can't find the container with id 0dac135b92ef638178cc0a9759bff9c9baf951fe6867010259799bbd7054316f Apr 24 22:14:47.039212 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:47.039178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" event={"ID":"cfcf9344-fd1a-4dce-9e25-0217bd467730","Type":"ContainerStarted","Data":"2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7"} Apr 24 22:14:47.039212 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:47.039215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" event={"ID":"cfcf9344-fd1a-4dce-9e25-0217bd467730","Type":"ContainerStarted","Data":"0dac135b92ef638178cc0a9759bff9c9baf951fe6867010259799bbd7054316f"} Apr 24 22:14:47.829133 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:47.829079 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.134.0.49:8643/healthz\": dial tcp 10.134.0.49:8643: connect: connection refused" Apr 24 22:14:47.836283 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:47.836248 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 22:14:50.658551 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.658525 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:14:50.725148 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.725069 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " Apr 24 22:14:50.725148 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.725106 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg9ql\" (UniqueName: \"kubernetes.io/projected/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kube-api-access-xg9ql\") pod \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " Apr 24 22:14:50.725148 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.725143 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kserve-provision-location\") pod \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " Apr 24 22:14:50.725448 ip-10-0-139-5 kubenswrapper[2571]: I0424 
22:14:50.725169 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls\") pod \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\" (UID: \"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8\") " Apr 24 22:14:50.725516 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.725478 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" (UID: "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:14:50.725516 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.725483 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" (UID: "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:14:50.727369 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.727345 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" (UID: "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:14:50.727470 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.727431 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kube-api-access-xg9ql" (OuterVolumeSpecName: "kube-api-access-xg9ql") pod "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" (UID: "5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8"). InnerVolumeSpecName "kube-api-access-xg9ql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:14:50.826243 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.826198 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:14:50.826243 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.826231 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xg9ql\" (UniqueName: \"kubernetes.io/projected/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kube-api-access-xg9ql\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:14:50.826243 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.826242 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:14:50.826243 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:50.826251 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:14:51.051536 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.051499 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerID="2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7" exitCode=0 Apr 24 22:14:51.051714 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.051572 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" event={"ID":"cfcf9344-fd1a-4dce-9e25-0217bd467730","Type":"ContainerDied","Data":"2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7"} Apr 24 22:14:51.053387 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.053364 2571 generic.go:358] "Generic (PLEG): container finished" podID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerID="8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f" exitCode=0 Apr 24 22:14:51.053492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.053416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" event={"ID":"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8","Type":"ContainerDied","Data":"8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f"} Apr 24 22:14:51.053492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.053439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" event={"ID":"5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8","Type":"ContainerDied","Data":"e53920a53bb8ce40f0d9ee14150eecc3af7898e19451ffd95928f6624f586135"} Apr 24 22:14:51.053492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.053450 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg" Apr 24 22:14:51.053492 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.053458 2571 scope.go:117] "RemoveContainer" containerID="cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f" Apr 24 22:14:51.061590 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.061573 2571 scope.go:117] "RemoveContainer" containerID="8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f" Apr 24 22:14:51.068489 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.068473 2571 scope.go:117] "RemoveContainer" containerID="f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa" Apr 24 22:14:51.075712 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.075697 2571 scope.go:117] "RemoveContainer" containerID="cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f" Apr 24 22:14:51.075983 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:14:51.075965 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f\": container with ID starting with cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f not found: ID does not exist" containerID="cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f" Apr 24 22:14:51.076038 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.075993 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f"} err="failed to get container status \"cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f\": rpc error: code = NotFound desc = could not find container \"cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f\": container with ID starting with cc64b650f43d42d537890615588408b0efaa837dde9b77545be722c40a32360f not found: ID does not exist" Apr 24 22:14:51.076038 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.076011 2571 scope.go:117] "RemoveContainer" containerID="8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f" Apr 24 22:14:51.076255 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:14:51.076235 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f\": container with ID starting with 8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f not found: ID does not exist" containerID="8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f" Apr 24 22:14:51.076347 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.076264 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f"} err="failed to get container status \"8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f\": rpc error: code = NotFound desc = could not find container \"8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f\": container with ID starting with 8259d28dfee95e4430adbf5ff985bb3ff9efd33760f216bc1ab055da40fe6f3f not found: ID does not exist" Apr 24 22:14:51.076347 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.076289 2571 scope.go:117] "RemoveContainer" containerID="f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa" Apr 24 22:14:51.076562 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:14:51.076538 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa\": container with ID starting with f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa not found: ID does not exist" containerID="f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa" Apr 24 22:14:51.076630 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:14:51.076568 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa"} err="failed to get container status \"f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa\": rpc error: code = NotFound desc = could not find container \"f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa\": container with ID starting with f192b2557e4403e7160c54d26bdd8e7817d80581dacf7777d52b73fb178b03fa not found: ID does not exist" Apr 24 22:14:51.109815 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.109793 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg"] Apr 24 22:14:51.118860 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.118840 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5d8dfb54c-n42jg"] Apr 24 22:14:51.649791 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:51.649754 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" path="/var/lib/kubelet/pods/5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8/volumes" Apr 24 22:14:55.068780 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:55.068681 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" event={"ID":"cfcf9344-fd1a-4dce-9e25-0217bd467730","Type":"ContainerStarted","Data":"16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9"} Apr 24 22:14:55.068780 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:55.068723 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" event={"ID":"cfcf9344-fd1a-4dce-9e25-0217bd467730","Type":"ContainerStarted","Data":"c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1"} Apr 24 22:14:55.069175 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:55.069068 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:55.069175 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:55.069092 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:14:55.070382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:55.070355 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 22:14:55.091878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:55.091831 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podStartSLOduration=6.392553687 podStartE2EDuration="10.091819185s" podCreationTimestamp="2026-04-24 22:14:45 +0000 UTC" firstStartedPulling="2026-04-24 22:14:51.052820433 +0000 UTC m=+2891.900864538" lastFinishedPulling="2026-04-24 22:14:54.75208594 +0000 UTC m=+2895.600130036" observedRunningTime="2026-04-24 22:14:55.091047215 +0000 UTC m=+2895.939091355" watchObservedRunningTime="2026-04-24 22:14:55.091819185 +0000 UTC m=+2895.939863300" Apr 24 22:14:56.072040 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:14:56.071998 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 22:15:01.077798 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:01.077770 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:15:01.078387 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:01.078362 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 22:15:11.079402 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:11.079375 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:15:26.469715 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.469680 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd"] Apr 24 22:15:26.470282 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.470041 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kserve-container" containerID="cri-o://c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1" gracePeriod=30 Apr 24 22:15:26.470282 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.470112 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" containerID="cri-o://16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9" gracePeriod=30 Apr 24 22:15:26.549672 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.549642 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm"] Apr 24 22:15:26.549924 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.549913 2571 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="storage-initializer" Apr 24 22:15:26.549971 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.549926 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="storage-initializer" Apr 24 22:15:26.549971 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.549934 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" Apr 24 22:15:26.549971 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.549939 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" Apr 24 22:15:26.549971 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.549950 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kube-rbac-proxy" Apr 24 22:15:26.549971 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.549956 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kube-rbac-proxy" Apr 24 22:15:26.550120 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.550009 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kserve-container" Apr 24 22:15:26.550120 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.550018 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f3bcb9a-d72a-489e-b7eb-eab4e0de9eb8" containerName="kube-rbac-proxy" Apr 24 22:15:26.553147 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.553125 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.555739 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.555715 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 24 22:15:26.555871 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.555786 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:15:26.565063 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.565038 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm"] Apr 24 22:15:26.710072 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.710042 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb8cj\" (UniqueName: \"kubernetes.io/projected/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kube-api-access-gb8cj\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.710072 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.710075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.710266 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.710118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.710266 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.710141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.810624 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.810596 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.810794 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.810636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.810887 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.810864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gb8cj\" (UniqueName: \"kubernetes.io/projected/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kube-api-access-gb8cj\") 
pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.810973 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.810908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.811038 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.811015 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.811466 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.811448 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.813071 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.813055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-proxy-tls\") pod 
\"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.821179 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.821158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb8cj\" (UniqueName: \"kubernetes.io/projected/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kube-api-access-gb8cj\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.863050 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.863024 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:26.991837 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:26.991711 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm"] Apr 24 22:15:26.994439 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:15:26.994412 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff602ec4_eabb_408e_89ea_529e1ee8eeb5.slice/crio-d34eae537a0f636ea4d2e685f06f7f59d8eafa3a8f969e57cae678ee9fc5dbdc WatchSource:0}: Error finding container d34eae537a0f636ea4d2e685f06f7f59d8eafa3a8f969e57cae678ee9fc5dbdc: Status 404 returned error can't find the container with id d34eae537a0f636ea4d2e685f06f7f59d8eafa3a8f969e57cae678ee9fc5dbdc Apr 24 22:15:27.166928 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:27.166828 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" 
event={"ID":"ff602ec4-eabb-408e-89ea-529e1ee8eeb5","Type":"ContainerStarted","Data":"4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df"} Apr 24 22:15:27.166928 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:27.166874 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" event={"ID":"ff602ec4-eabb-408e-89ea-529e1ee8eeb5","Type":"ContainerStarted","Data":"d34eae537a0f636ea4d2e685f06f7f59d8eafa3a8f969e57cae678ee9fc5dbdc"} Apr 24 22:15:27.168987 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:27.168962 2571 generic.go:358] "Generic (PLEG): container finished" podID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerID="16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9" exitCode=2 Apr 24 22:15:27.169082 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:27.169023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" event={"ID":"cfcf9344-fd1a-4dce-9e25-0217bd467730","Type":"ContainerDied","Data":"16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9"} Apr 24 22:15:31.073122 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:31.073071 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 24 22:15:32.182255 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:32.182214 2571 generic.go:358] "Generic (PLEG): container finished" podID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerID="4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df" exitCode=0 Apr 24 22:15:32.182689 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:32.182270 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" event={"ID":"ff602ec4-eabb-408e-89ea-529e1ee8eeb5","Type":"ContainerDied","Data":"4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df"} Apr 24 22:15:33.186654 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:33.186617 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" event={"ID":"ff602ec4-eabb-408e-89ea-529e1ee8eeb5","Type":"ContainerStarted","Data":"2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638"} Apr 24 22:15:33.186654 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:33.186657 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" event={"ID":"ff602ec4-eabb-408e-89ea-529e1ee8eeb5","Type":"ContainerStarted","Data":"12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522"} Apr 24 22:15:33.187067 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:33.186946 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:33.187067 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:33.187056 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:33.188008 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:33.187984 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 22:15:33.206151 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:33.206102 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podStartSLOduration=7.206087685 podStartE2EDuration="7.206087685s" podCreationTimestamp="2026-04-24 22:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:15:33.204503035 +0000 UTC m=+2934.052547150" watchObservedRunningTime="2026-04-24 22:15:33.206087685 +0000 UTC m=+2934.054131795" Apr 24 22:15:34.189575 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:34.189528 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 22:15:36.072334 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:36.072272 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 24 22:15:39.194180 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:39.194150 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:39.194916 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:39.194772 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 22:15:41.072990 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:41.072940 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 24 22:15:41.073394 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:41.073116 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:15:46.072754 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:46.072695 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 24 22:15:49.194855 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:49.194825 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:15:51.072701 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:51.072655 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 24 22:15:56.072718 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:56.072673 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 24 22:15:57.104185 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:15:57.104163 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:15:57.130089 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.130055 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfcf9344-fd1a-4dce-9e25-0217bd467730-kserve-provision-location\") pod \"cfcf9344-fd1a-4dce-9e25-0217bd467730\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " Apr 24 22:15:57.130254 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.130119 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fchhw\" (UniqueName: \"kubernetes.io/projected/cfcf9344-fd1a-4dce-9e25-0217bd467730-kube-api-access-fchhw\") pod \"cfcf9344-fd1a-4dce-9e25-0217bd467730\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " Apr 24 22:15:57.130254 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.130156 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcf9344-fd1a-4dce-9e25-0217bd467730-proxy-tls\") pod \"cfcf9344-fd1a-4dce-9e25-0217bd467730\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " Apr 24 22:15:57.130254 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.130204 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cfcf9344-fd1a-4dce-9e25-0217bd467730-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"cfcf9344-fd1a-4dce-9e25-0217bd467730\" (UID: \"cfcf9344-fd1a-4dce-9e25-0217bd467730\") " Apr 24 22:15:57.130685 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.130656 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cfcf9344-fd1a-4dce-9e25-0217bd467730-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "cfcf9344-fd1a-4dce-9e25-0217bd467730" (UID: "cfcf9344-fd1a-4dce-9e25-0217bd467730"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:15:57.132406 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.132378 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcf9344-fd1a-4dce-9e25-0217bd467730-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cfcf9344-fd1a-4dce-9e25-0217bd467730" (UID: "cfcf9344-fd1a-4dce-9e25-0217bd467730"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:15:57.132513 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.132378 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcf9344-fd1a-4dce-9e25-0217bd467730-kube-api-access-fchhw" (OuterVolumeSpecName: "kube-api-access-fchhw") pod "cfcf9344-fd1a-4dce-9e25-0217bd467730" (UID: "cfcf9344-fd1a-4dce-9e25-0217bd467730"). InnerVolumeSpecName "kube-api-access-fchhw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:15:57.145793 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.145762 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfcf9344-fd1a-4dce-9e25-0217bd467730-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cfcf9344-fd1a-4dce-9e25-0217bd467730" (UID: "cfcf9344-fd1a-4dce-9e25-0217bd467730"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:15:57.231004 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.230980 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfcf9344-fd1a-4dce-9e25-0217bd467730-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:15:57.231004 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.231007 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fchhw\" (UniqueName: \"kubernetes.io/projected/cfcf9344-fd1a-4dce-9e25-0217bd467730-kube-api-access-fchhw\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:15:57.231174 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.231023 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcf9344-fd1a-4dce-9e25-0217bd467730-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:15:57.231174 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.231035 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cfcf9344-fd1a-4dce-9e25-0217bd467730-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:15:57.255974 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.255944 2571 generic.go:358] "Generic (PLEG): container finished" podID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerID="c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1" exitCode=137 Apr 24 22:15:57.256110 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.255980 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" event={"ID":"cfcf9344-fd1a-4dce-9e25-0217bd467730","Type":"ContainerDied","Data":"c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1"} Apr 
24 22:15:57.256110 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.256019 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" Apr 24 22:15:57.256110 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.256026 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd" event={"ID":"cfcf9344-fd1a-4dce-9e25-0217bd467730","Type":"ContainerDied","Data":"0dac135b92ef638178cc0a9759bff9c9baf951fe6867010259799bbd7054316f"} Apr 24 22:15:57.256110 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.256045 2571 scope.go:117] "RemoveContainer" containerID="16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9" Apr 24 22:15:57.264130 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.264116 2571 scope.go:117] "RemoveContainer" containerID="c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1" Apr 24 22:15:57.270815 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.270798 2571 scope.go:117] "RemoveContainer" containerID="2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7" Apr 24 22:15:57.277092 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.277068 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd"] Apr 24 22:15:57.278120 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.278108 2571 scope.go:117] "RemoveContainer" containerID="16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9" Apr 24 22:15:57.278395 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:15:57.278377 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9\": container with ID starting with 16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9 not found: ID does not exist" 
containerID="16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9" Apr 24 22:15:57.278439 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.278406 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9"} err="failed to get container status \"16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9\": rpc error: code = NotFound desc = could not find container \"16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9\": container with ID starting with 16d5d567c92dc90dfc600503932506d3594a3c1a2d1448aac8f1e7a1298bead9 not found: ID does not exist" Apr 24 22:15:57.278439 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.278437 2571 scope.go:117] "RemoveContainer" containerID="c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1" Apr 24 22:15:57.278690 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:15:57.278673 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1\": container with ID starting with c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1 not found: ID does not exist" containerID="c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1" Apr 24 22:15:57.278734 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.278696 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1"} err="failed to get container status \"c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1\": rpc error: code = NotFound desc = could not find container \"c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1\": container with ID starting with c339b572427a448e3cc64aee9f6ae0919e18fd54e3136ac344c40c58846f91d1 not found: ID does not exist" Apr 24 22:15:57.278734 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.278712 2571 scope.go:117] "RemoveContainer" containerID="2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7" Apr 24 22:15:57.278946 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:15:57.278928 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7\": container with ID starting with 2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7 not found: ID does not exist" containerID="2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7" Apr 24 22:15:57.278988 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.278950 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7"} err="failed to get container status \"2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7\": rpc error: code = NotFound desc = could not find container \"2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7\": container with ID starting with 2753a6915e4ff0f1c1845dcac609a7131e1170ea4852c048ce09c0d7f7a4f5d7 not found: ID does not exist" Apr 24 22:15:57.282188 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.282165 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-24gzd"] Apr 24 22:15:57.647345 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:15:57.647254 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" path="/var/lib/kubelet/pods/cfcf9344-fd1a-4dce-9e25-0217bd467730/volumes" Apr 24 22:16:07.261709 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.261677 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm"] Apr 24 22:16:07.262195 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:16:07.262077 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kserve-container" containerID="cri-o://12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522" gracePeriod=30 Apr 24 22:16:07.262195 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.262129 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy" containerID="cri-o://2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638" gracePeriod=30 Apr 24 22:16:07.354326 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354276 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"] Apr 24 22:16:07.354601 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354588 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" Apr 24 22:16:07.354650 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354603 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" Apr 24 22:16:07.354650 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354618 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="storage-initializer" Apr 24 22:16:07.354650 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354623 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="storage-initializer" Apr 24 22:16:07.354650 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354634 2571 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kserve-container" Apr 24 22:16:07.354650 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354639 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kserve-container" Apr 24 22:16:07.354802 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354689 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kserve-container" Apr 24 22:16:07.354802 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.354700 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfcf9344-fd1a-4dce-9e25-0217bd467730" containerName="kube-rbac-proxy" Apr 24 22:16:07.359349 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.359330 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.361818 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.361797 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 24 22:16:07.361921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.361824 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 24 22:16:07.370252 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.370204 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"] Apr 24 22:16:07.410652 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.410566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98c7d16f-cac9-4e89-8462-854616321bc9-isvc-triton-kube-rbac-proxy-sar-config\") pod 
\"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.410652 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.410607 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.410652 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.410638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxh95\" (UniqueName: \"kubernetes.io/projected/98c7d16f-cac9-4e89-8462-854616321bc9-kube-api-access-hxh95\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.410848 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.410665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c7d16f-cac9-4e89-8462-854616321bc9-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.510976 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.510946 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98c7d16f-cac9-4e89-8462-854616321bc9-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.511123 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.510984 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.511123 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.511019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxh95\" (UniqueName: \"kubernetes.io/projected/98c7d16f-cac9-4e89-8462-854616321bc9-kube-api-access-hxh95\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.511123 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.511036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c7d16f-cac9-4e89-8462-854616321bc9-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.511278 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:16:07.511185 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-triton-predictor-serving-cert: secret "isvc-triton-predictor-serving-cert" not found Apr 24 22:16:07.511278 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:16:07.511256 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls podName:98c7d16f-cac9-4e89-8462-854616321bc9 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:16:08.011234938 +0000 UTC m=+2968.859279031 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls") pod "isvc-triton-predictor-84bb65d94b-5qstx" (UID: "98c7d16f-cac9-4e89-8462-854616321bc9") : secret "isvc-triton-predictor-serving-cert" not found Apr 24 22:16:07.511463 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.511445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c7d16f-cac9-4e89-8462-854616321bc9-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.511668 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.511649 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98c7d16f-cac9-4e89-8462-854616321bc9-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:07.519446 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:07.519423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxh95\" (UniqueName: \"kubernetes.io/projected/98c7d16f-cac9-4e89-8462-854616321bc9-kube-api-access-hxh95\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:08.014775 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:08.014743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:08.017188 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:08.017157 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-5qstx\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:08.271608 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:08.271538 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:16:08.288592 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:08.288558 2571 generic.go:358] "Generic (PLEG): container finished" podID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerID="2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638" exitCode=2 Apr 24 22:16:08.288703 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:08.288622 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" event={"ID":"ff602ec4-eabb-408e-89ea-529e1ee8eeb5","Type":"ContainerDied","Data":"2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638"} Apr 24 22:16:08.397735 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:08.397711 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"] Apr 24 22:16:08.400034 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:16:08.400006 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c7d16f_cac9_4e89_8462_854616321bc9.slice/crio-2ba4bda0af17f09cac129dd150e4e5b874efdffac5cf12daf63caff9b7450135 WatchSource:0}: Error finding container 2ba4bda0af17f09cac129dd150e4e5b874efdffac5cf12daf63caff9b7450135: Status 404 returned error can't find the container with id 2ba4bda0af17f09cac129dd150e4e5b874efdffac5cf12daf63caff9b7450135 Apr 24 22:16:09.190611 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:09.190570 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused" Apr 24 22:16:09.293428 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:09.293394 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" event={"ID":"98c7d16f-cac9-4e89-8462-854616321bc9","Type":"ContainerStarted","Data":"7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a"} Apr 24 22:16:09.293797 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:09.293435 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" event={"ID":"98c7d16f-cac9-4e89-8462-854616321bc9","Type":"ContainerStarted","Data":"2ba4bda0af17f09cac129dd150e4e5b874efdffac5cf12daf63caff9b7450135"} Apr 24 22:16:12.303312 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:12.303265 2571 generic.go:358] "Generic (PLEG): container finished" podID="98c7d16f-cac9-4e89-8462-854616321bc9" containerID="7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a" exitCode=0 Apr 24 22:16:12.303737 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:12.303330 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" event={"ID":"98c7d16f-cac9-4e89-8462-854616321bc9","Type":"ContainerDied","Data":"7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a"} Apr 24 22:16:14.190188 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:14.190141 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused" Apr 24 22:16:19.190950 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:19.190897 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused" Apr 24 22:16:19.191478 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:19.191048 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:16:24.189756 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:24.189709 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused" Apr 24 22:16:29.190006 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:29.189947 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy" 
probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused" Apr 24 22:16:34.190104 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:34.190055 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused" Apr 24 22:16:37.961744 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:37.961719 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:16:38.000490 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.000457 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kserve-provision-location\") pod \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " Apr 24 22:16:38.000685 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.000514 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-proxy-tls\") pod \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " Apr 24 22:16:38.000685 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.000607 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb8cj\" (UniqueName: \"kubernetes.io/projected/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kube-api-access-gb8cj\") pod \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " Apr 24 22:16:38.000685 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.000641 
2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\" (UID: \"ff602ec4-eabb-408e-89ea-529e1ee8eeb5\") " Apr 24 22:16:38.001161 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.001118 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "ff602ec4-eabb-408e-89ea-529e1ee8eeb5" (UID: "ff602ec4-eabb-408e-89ea-529e1ee8eeb5"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:16:38.005406 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.005366 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kube-api-access-gb8cj" (OuterVolumeSpecName: "kube-api-access-gb8cj") pod "ff602ec4-eabb-408e-89ea-529e1ee8eeb5" (UID: "ff602ec4-eabb-408e-89ea-529e1ee8eeb5"). InnerVolumeSpecName "kube-api-access-gb8cj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:38.005406 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.005403 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ff602ec4-eabb-408e-89ea-529e1ee8eeb5" (UID: "ff602ec4-eabb-408e-89ea-529e1ee8eeb5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:16:38.008787 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.008758 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ff602ec4-eabb-408e-89ea-529e1ee8eeb5" (UID: "ff602ec4-eabb-408e-89ea-529e1ee8eeb5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:38.101933 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.101841 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gb8cj\" (UniqueName: \"kubernetes.io/projected/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kube-api-access-gb8cj\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:16:38.101933 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.101881 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:16:38.101933 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.101899 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:16:38.101933 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.101913 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff602ec4-eabb-408e-89ea-529e1ee8eeb5-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:16:38.410267 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.410184 2571 generic.go:358] "Generic (PLEG): container 
finished" podID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerID="12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522" exitCode=137 Apr 24 22:16:38.410267 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.410233 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" event={"ID":"ff602ec4-eabb-408e-89ea-529e1ee8eeb5","Type":"ContainerDied","Data":"12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522"} Apr 24 22:16:38.410528 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.410270 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" event={"ID":"ff602ec4-eabb-408e-89ea-529e1ee8eeb5","Type":"ContainerDied","Data":"d34eae537a0f636ea4d2e685f06f7f59d8eafa3a8f969e57cae678ee9fc5dbdc"} Apr 24 22:16:38.410528 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.410312 2571 scope.go:117] "RemoveContainer" containerID="2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638" Apr 24 22:16:38.410528 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.410344 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm" Apr 24 22:16:38.421203 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.421185 2571 scope.go:117] "RemoveContainer" containerID="12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522" Apr 24 22:16:38.431755 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.431731 2571 scope.go:117] "RemoveContainer" containerID="4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df" Apr 24 22:16:38.439500 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.439452 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm"] Apr 24 22:16:38.441991 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.441968 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-jb9rm"] Apr 24 22:16:38.443889 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.443868 2571 scope.go:117] "RemoveContainer" containerID="2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638" Apr 24 22:16:38.444473 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:16:38.444447 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638\": container with ID starting with 2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638 not found: ID does not exist" containerID="2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638" Apr 24 22:16:38.444568 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.444484 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638"} err="failed to get container status \"2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638\": rpc error: code = NotFound desc = could not find 
container \"2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638\": container with ID starting with 2217bc19473abc77c488b88cf4883cc176dd3d021795480d99054ab0a28de638 not found: ID does not exist" Apr 24 22:16:38.444568 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.444512 2571 scope.go:117] "RemoveContainer" containerID="12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522" Apr 24 22:16:38.444827 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:16:38.444800 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522\": container with ID starting with 12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522 not found: ID does not exist" containerID="12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522" Apr 24 22:16:38.444940 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.444836 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522"} err="failed to get container status \"12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522\": rpc error: code = NotFound desc = could not find container \"12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522\": container with ID starting with 12a1f80a8a33656e8c76f587eb5ea7bc9089e2dc8b4b17d00e6efa39461d9522 not found: ID does not exist" Apr 24 22:16:38.444940 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.444858 2571 scope.go:117] "RemoveContainer" containerID="4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df" Apr 24 22:16:38.445220 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:16:38.445156 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df\": container with ID starting with 
4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df not found: ID does not exist" containerID="4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df" Apr 24 22:16:38.445220 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:38.445192 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df"} err="failed to get container status \"4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df\": rpc error: code = NotFound desc = could not find container \"4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df\": container with ID starting with 4138ad4c655e845e3c72247d08f04452722449fdd31d60f6b0cc3bd68e84f6df not found: ID does not exist" Apr 24 22:16:39.653325 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:16:39.650929 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" path="/var/lib/kubelet/pods/ff602ec4-eabb-408e-89ea-529e1ee8eeb5/volumes" Apr 24 22:18:07.679346 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:07.679290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" event={"ID":"98c7d16f-cac9-4e89-8462-854616321bc9","Type":"ContainerStarted","Data":"e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5"} Apr 24 22:18:07.679346 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:07.679349 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" event={"ID":"98c7d16f-cac9-4e89-8462-854616321bc9","Type":"ContainerStarted","Data":"60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066"} Apr 24 22:18:07.679828 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:07.679547 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" Apr 24 22:18:07.679828 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:07.679699 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"
Apr 24 22:18:07.680755 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:07.680729 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Apr 24 22:18:07.702097 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:07.702047 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" podStartSLOduration=5.967736344 podStartE2EDuration="2m0.702028885s" podCreationTimestamp="2026-04-24 22:16:07 +0000 UTC" firstStartedPulling="2026-04-24 22:16:12.304485425 +0000 UTC m=+2973.152529519" lastFinishedPulling="2026-04-24 22:18:07.038777962 +0000 UTC m=+3087.886822060" observedRunningTime="2026-04-24 22:18:07.699560827 +0000 UTC m=+3088.547604942" watchObservedRunningTime="2026-04-24 22:18:07.702028885 +0000 UTC m=+3088.550073004"
Apr 24 22:18:08.682023 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:08.681975 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Apr 24 22:18:13.686392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:13.686358 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"
Apr 24 22:18:13.687090 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:13.687072 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"
Apr 24 22:18:19.789683 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.789649 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"]
Apr 24 22:18:19.790260 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.789961 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kserve-container" containerID="cri-o://60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066" gracePeriod=30
Apr 24 22:18:19.790260 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.790035 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kube-rbac-proxy" containerID="cri-o://e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5" gracePeriod=30
Apr 24 22:18:19.909174 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909132 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"]
Apr 24 22:18:19.909480 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909467 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kserve-container"
Apr 24 22:18:19.909532 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909482 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kserve-container"
Apr 24 22:18:19.909532 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909493 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="storage-initializer"
Apr 24 22:18:19.909532 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909498 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="storage-initializer"
Apr 24 22:18:19.909532 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909517 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy"
Apr 24 22:18:19.909532 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909522 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy"
Apr 24 22:18:19.909686 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909565 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kube-rbac-proxy"
Apr 24 22:18:19.909686 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.909575 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff602ec4-eabb-408e-89ea-529e1ee8eeb5" containerName="kserve-container"
Apr 24 22:18:19.926053 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.926025 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"]
Apr 24 22:18:19.926191 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.926148 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:19.928550 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.928529 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\""
Apr 24 22:18:19.928674 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:19.928567 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\""
Apr 24 22:18:20.084701 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.084598 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl6cd\" (UniqueName: \"kubernetes.io/projected/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kube-api-access-cl6cd\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.084701 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.084643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.084701 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.084663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0cfc683-b931-4dc5-ac45-a9f4c7232481-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.084989 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.084806 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0cfc683-b931-4dc5-ac45-a9f4c7232481-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.185424 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.185388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0cfc683-b931-4dc5-ac45-a9f4c7232481-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.185575 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.185466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl6cd\" (UniqueName: \"kubernetes.io/projected/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kube-api-access-cl6cd\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.185575 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.185491 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.185575 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.185511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0cfc683-b931-4dc5-ac45-a9f4c7232481-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.185930 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.185906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.186373 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.186237 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0cfc683-b931-4dc5-ac45-a9f4c7232481-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.188515 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.188492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0cfc683-b931-4dc5-ac45-a9f4c7232481-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.196817 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.196797 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl6cd\" (UniqueName: \"kubernetes.io/projected/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kube-api-access-cl6cd\") pod \"isvc-xgboost-predictor-8689c4cfcc-p4qn9\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.236085 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.236052 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:20.356499 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.356434 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"]
Apr 24 22:18:20.359355 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:18:20.359325 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0cfc683_b931_4dc5_ac45_a9f4c7232481.slice/crio-d4b65ee4bdc7556bf059c1e8994f993a2e08a8b0f1304a9d5beb901669ad1d16 WatchSource:0}: Error finding container d4b65ee4bdc7556bf059c1e8994f993a2e08a8b0f1304a9d5beb901669ad1d16: Status 404 returned error can't find the container with id d4b65ee4bdc7556bf059c1e8994f993a2e08a8b0f1304a9d5beb901669ad1d16
Apr 24 22:18:20.713644 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.713560 2571 generic.go:358] "Generic (PLEG): container finished" podID="98c7d16f-cac9-4e89-8462-854616321bc9" containerID="e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5" exitCode=2
Apr 24 22:18:20.713785 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.713632 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" event={"ID":"98c7d16f-cac9-4e89-8462-854616321bc9","Type":"ContainerDied","Data":"e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5"}
Apr 24 22:18:20.714828 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.714807 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" event={"ID":"e0cfc683-b931-4dc5-ac45-a9f4c7232481","Type":"ContainerStarted","Data":"c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29"}
Apr 24 22:18:20.714947 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:20.714849 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" event={"ID":"e0cfc683-b931-4dc5-ac45-a9f4c7232481","Type":"ContainerStarted","Data":"d4b65ee4bdc7556bf059c1e8994f993a2e08a8b0f1304a9d5beb901669ad1d16"}
Apr 24 22:18:23.017817 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.017796 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"
Apr 24 22:18:23.109839 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.109798 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c7d16f-cac9-4e89-8462-854616321bc9-kserve-provision-location\") pod \"98c7d16f-cac9-4e89-8462-854616321bc9\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") "
Apr 24 22:18:23.110008 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.109886 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls\") pod \"98c7d16f-cac9-4e89-8462-854616321bc9\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") "
Apr 24 22:18:23.110008 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.109934 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98c7d16f-cac9-4e89-8462-854616321bc9-isvc-triton-kube-rbac-proxy-sar-config\") pod \"98c7d16f-cac9-4e89-8462-854616321bc9\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") "
Apr 24 22:18:23.110008 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.109967 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxh95\" (UniqueName: \"kubernetes.io/projected/98c7d16f-cac9-4e89-8462-854616321bc9-kube-api-access-hxh95\") pod \"98c7d16f-cac9-4e89-8462-854616321bc9\" (UID: \"98c7d16f-cac9-4e89-8462-854616321bc9\") "
Apr 24 22:18:23.110230 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.110206 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c7d16f-cac9-4e89-8462-854616321bc9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "98c7d16f-cac9-4e89-8462-854616321bc9" (UID: "98c7d16f-cac9-4e89-8462-854616321bc9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:18:23.110275 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.110231 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c7d16f-cac9-4e89-8462-854616321bc9-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "98c7d16f-cac9-4e89-8462-854616321bc9" (UID: "98c7d16f-cac9-4e89-8462-854616321bc9"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:18:23.112241 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.112215 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "98c7d16f-cac9-4e89-8462-854616321bc9" (UID: "98c7d16f-cac9-4e89-8462-854616321bc9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:18:23.112241 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.112225 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c7d16f-cac9-4e89-8462-854616321bc9-kube-api-access-hxh95" (OuterVolumeSpecName: "kube-api-access-hxh95") pod "98c7d16f-cac9-4e89-8462-854616321bc9" (UID: "98c7d16f-cac9-4e89-8462-854616321bc9"). InnerVolumeSpecName "kube-api-access-hxh95". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:18:23.211374 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.211340 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c7d16f-cac9-4e89-8462-854616321bc9-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:18:23.211374 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.211368 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98c7d16f-cac9-4e89-8462-854616321bc9-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:18:23.211374 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.211379 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxh95\" (UniqueName: \"kubernetes.io/projected/98c7d16f-cac9-4e89-8462-854616321bc9-kube-api-access-hxh95\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:18:23.211605 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.211388 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98c7d16f-cac9-4e89-8462-854616321bc9-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:18:23.723614 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.723584 2571 generic.go:358] "Generic (PLEG): container finished" podID="98c7d16f-cac9-4e89-8462-854616321bc9" containerID="60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066" exitCode=0
Apr 24 22:18:23.723789 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.723625 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" event={"ID":"98c7d16f-cac9-4e89-8462-854616321bc9","Type":"ContainerDied","Data":"60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066"}
Apr 24 22:18:23.723789 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.723659 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx" event={"ID":"98c7d16f-cac9-4e89-8462-854616321bc9","Type":"ContainerDied","Data":"2ba4bda0af17f09cac129dd150e4e5b874efdffac5cf12daf63caff9b7450135"}
Apr 24 22:18:23.723789 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.723678 2571 scope.go:117] "RemoveContainer" containerID="e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5"
Apr 24 22:18:23.723789 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.723720 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"
Apr 24 22:18:23.734102 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.734079 2571 scope.go:117] "RemoveContainer" containerID="60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066"
Apr 24 22:18:23.741075 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.741053 2571 scope.go:117] "RemoveContainer" containerID="7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a"
Apr 24 22:18:23.744161 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.744140 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"]
Apr 24 22:18:23.748030 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.748010 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-5qstx"]
Apr 24 22:18:23.748233 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.748216 2571 scope.go:117] "RemoveContainer" containerID="e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5"
Apr 24 22:18:23.748503 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:18:23.748484 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5\": container with ID starting with e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5 not found: ID does not exist" containerID="e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5"
Apr 24 22:18:23.748555 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.748511 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5"} err="failed to get container status \"e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5\": rpc error: code = NotFound desc = could not find container \"e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5\": container with ID starting with e163d2a7a331a94e3893b58eb35a1f19b9f91879ddd3cad71a651efb66bcf9b5 not found: ID does not exist"
Apr 24 22:18:23.748555 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.748529 2571 scope.go:117] "RemoveContainer" containerID="60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066"
Apr 24 22:18:23.748783 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:18:23.748768 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066\": container with ID starting with 60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066 not found: ID does not exist" containerID="60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066"
Apr 24 22:18:23.748821 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.748790 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066"} err="failed to get container status \"60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066\": rpc error: code = NotFound desc = could not find container \"60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066\": container with ID starting with 60a07fd1502b11d0e8ae6581d1152946d7943beb34a9ef0900846f526e0f1066 not found: ID does not exist"
Apr 24 22:18:23.748821 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.748808 2571 scope.go:117] "RemoveContainer" containerID="7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a"
Apr 24 22:18:23.749010 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:18:23.748994 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a\": container with ID starting with 7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a not found: ID does not exist" containerID="7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a"
Apr 24 22:18:23.749048 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:23.749016 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a"} err="failed to get container status \"7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a\": rpc error: code = NotFound desc = could not find container \"7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a\": container with ID starting with 7543a74f6b9044554ac826bb8b988636a1b68857ee71602495473024767e416a not found: ID does not exist"
Apr 24 22:18:24.727660 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:24.727628 2571 generic.go:358] "Generic (PLEG): container finished" podID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerID="c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29" exitCode=0
Apr 24 22:18:24.728039 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:24.727682 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" event={"ID":"e0cfc683-b931-4dc5-ac45-a9f4c7232481","Type":"ContainerDied","Data":"c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29"}
Apr 24 22:18:25.648181 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:25.648147 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" path="/var/lib/kubelet/pods/98c7d16f-cac9-4e89-8462-854616321bc9/volumes"
Apr 24 22:18:46.509844 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:46.509825 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:18:46.791902 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:46.791870 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" event={"ID":"e0cfc683-b931-4dc5-ac45-a9f4c7232481","Type":"ContainerStarted","Data":"093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab"}
Apr 24 22:18:46.791902 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:46.791903 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" event={"ID":"e0cfc683-b931-4dc5-ac45-a9f4c7232481","Type":"ContainerStarted","Data":"da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d"}
Apr 24 22:18:46.792107 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:46.792096 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:46.813657 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:46.813602 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podStartSLOduration=6.161448774 podStartE2EDuration="27.813583995s" podCreationTimestamp="2026-04-24 22:18:19 +0000 UTC" firstStartedPulling="2026-04-24 22:18:24.728867253 +0000 UTC m=+3105.576911347" lastFinishedPulling="2026-04-24 22:18:46.381002471 +0000 UTC m=+3127.229046568" observedRunningTime="2026-04-24 22:18:46.811757553 +0000 UTC m=+3127.659801668" watchObservedRunningTime="2026-04-24 22:18:46.813583995 +0000 UTC m=+3127.661628117"
Apr 24 22:18:47.794382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:47.794348 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:47.795650 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:47.795618 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:18:48.796638 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:48.796596 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:18:53.801221 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:53.801194 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:18:53.801681 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:18:53.801647 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:19:03.802119 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:19:03.802029 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:19:13.801958 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:19:13.801913 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:19:23.802404 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:19:23.802360 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:19:33.802014 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:19:33.801970 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:19:43.801795 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:19:43.801745 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 22:19:53.802463 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:19:53.802428 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"
Apr 24 22:20:00.017834 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.017792 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"]
Apr 24 22:20:00.018279 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.018203 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" containerID="cri-o://da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d" gracePeriod=30
Apr 24 22:20:00.018395 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.018363 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kube-rbac-proxy" containerID="cri-o://093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab" gracePeriod=30
Apr 24 22:20:00.114976 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.114939 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"]
Apr 24 22:20:00.115264 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.115249 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="storage-initializer"
Apr 24 22:20:00.115338 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.115267 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="storage-initializer"
Apr 24 22:20:00.115338 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.115277 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kserve-container"
Apr 24 22:20:00.115338 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.115282 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kserve-container"
Apr 24 22:20:00.115338 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.115311 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kube-rbac-proxy"
Apr 24 22:20:00.115338 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.115318 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kube-rbac-proxy"
Apr 24 22:20:00.115499 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.115370 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kserve-container"
Apr 24 22:20:00.115499 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.115380 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="98c7d16f-cac9-4e89-8462-854616321bc9" containerName="kube-rbac-proxy"
Apr 24 22:20:00.118518 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.118499 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.121163 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.121142 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\""
Apr 24 22:20:00.121358 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.121338 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 24 22:20:00.130551 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.130530 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"]
Apr 24 22:20:00.204818 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.204774 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6a84850-46be-46f3-ace8-6048a654b706-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.205004 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.204830 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7ns6\" (UniqueName: \"kubernetes.io/projected/e6a84850-46be-46f3-ace8-6048a654b706-kube-api-access-v7ns6\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.205004 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.204900 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6a84850-46be-46f3-ace8-6048a654b706-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.205004 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.204961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6a84850-46be-46f3-ace8-6048a654b706-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.306254 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.306165 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6a84850-46be-46f3-ace8-6048a654b706-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.306443 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.306252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6a84850-46be-46f3-ace8-6048a654b706-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.306443 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.306323 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6a84850-46be-46f3-ace8-6048a654b706-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.306443 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.306355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7ns6\" (UniqueName: \"kubernetes.io/projected/e6a84850-46be-46f3-ace8-6048a654b706-kube-api-access-v7ns6\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.306667 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.306652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6a84850-46be-46f3-ace8-6048a654b706-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.306993 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.306977 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6a84850-46be-46f3-ace8-6048a654b706-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.308940 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.308922 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6a84850-46be-46f3-ace8-6048a654b706-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.314986 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.314963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7ns6\" (UniqueName: \"kubernetes.io/projected/e6a84850-46be-46f3-ace8-6048a654b706-kube-api-access-v7ns6\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:00.429580 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.429541 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" Apr 24 22:20:00.612310 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.612269 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"] Apr 24 22:20:00.614470 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:20:00.614428 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6a84850_46be_46f3_ace8_6048a654b706.slice/crio-517cba52228a68b21ff2f9ad2d21034a6b61924b9befff31defd31e0a9a7d2b6 WatchSource:0}: Error finding container 517cba52228a68b21ff2f9ad2d21034a6b61924b9befff31defd31e0a9a7d2b6: Status 404 returned error can't find the container with id 517cba52228a68b21ff2f9ad2d21034a6b61924b9befff31defd31e0a9a7d2b6 Apr 24 22:20:00.999397 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.999360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" event={"ID":"e6a84850-46be-46f3-ace8-6048a654b706","Type":"ContainerStarted","Data":"5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb"} Apr 24 22:20:00.999397 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:00.999402 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" event={"ID":"e6a84850-46be-46f3-ace8-6048a654b706","Type":"ContainerStarted","Data":"517cba52228a68b21ff2f9ad2d21034a6b61924b9befff31defd31e0a9a7d2b6"} Apr 24 22:20:01.001151 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:01.001123 2571 generic.go:358] "Generic (PLEG): container finished" podID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerID="093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab" exitCode=2 Apr 24 22:20:01.001349 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:01.001174 2571 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" event={"ID":"e0cfc683-b931-4dc5-ac45-a9f4c7232481","Type":"ContainerDied","Data":"093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab"} Apr 24 22:20:03.645815 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.645790 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" Apr 24 22:20:03.740169 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.740084 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0cfc683-b931-4dc5-ac45-a9f4c7232481-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " Apr 24 22:20:03.740169 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.740133 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl6cd\" (UniqueName: \"kubernetes.io/projected/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kube-api-access-cl6cd\") pod \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " Apr 24 22:20:03.740169 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.740164 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0cfc683-b931-4dc5-ac45-a9f4c7232481-proxy-tls\") pod \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\" (UID: \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " Apr 24 22:20:03.740485 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.740210 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kserve-provision-location\") pod \"e0cfc683-b931-4dc5-ac45-a9f4c7232481\" (UID: 
\"e0cfc683-b931-4dc5-ac45-a9f4c7232481\") " Apr 24 22:20:03.740589 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.740556 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cfc683-b931-4dc5-ac45-a9f4c7232481-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "e0cfc683-b931-4dc5-ac45-a9f4c7232481" (UID: "e0cfc683-b931-4dc5-ac45-a9f4c7232481"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:20:03.741090 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.740886 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e0cfc683-b931-4dc5-ac45-a9f4c7232481" (UID: "e0cfc683-b931-4dc5-ac45-a9f4c7232481"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:20:03.746716 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.746684 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kube-api-access-cl6cd" (OuterVolumeSpecName: "kube-api-access-cl6cd") pod "e0cfc683-b931-4dc5-ac45-a9f4c7232481" (UID: "e0cfc683-b931-4dc5-ac45-a9f4c7232481"). InnerVolumeSpecName "kube-api-access-cl6cd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:20:03.746868 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.746848 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0cfc683-b931-4dc5-ac45-a9f4c7232481-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e0cfc683-b931-4dc5-ac45-a9f4c7232481" (UID: "e0cfc683-b931-4dc5-ac45-a9f4c7232481"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:20:03.841080 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.841049 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0cfc683-b931-4dc5-ac45-a9f4c7232481-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:20:03.841080 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.841079 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cl6cd\" (UniqueName: \"kubernetes.io/projected/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kube-api-access-cl6cd\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:20:03.841260 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.841090 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0cfc683-b931-4dc5-ac45-a9f4c7232481-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:20:03.841260 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:03.841100 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0cfc683-b931-4dc5-ac45-a9f4c7232481-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:20:04.010964 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.010925 2571 generic.go:358] "Generic (PLEG): container finished" podID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerID="da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d" exitCode=0 Apr 24 22:20:04.011146 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.010988 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" event={"ID":"e0cfc683-b931-4dc5-ac45-a9f4c7232481","Type":"ContainerDied","Data":"da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d"} Apr 24 
22:20:04.011146 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.011024 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" event={"ID":"e0cfc683-b931-4dc5-ac45-a9f4c7232481","Type":"ContainerDied","Data":"d4b65ee4bdc7556bf059c1e8994f993a2e08a8b0f1304a9d5beb901669ad1d16"} Apr 24 22:20:04.011146 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.011031 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9" Apr 24 22:20:04.011146 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.011043 2571 scope.go:117] "RemoveContainer" containerID="093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab" Apr 24 22:20:04.019133 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.019114 2571 scope.go:117] "RemoveContainer" containerID="da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d" Apr 24 22:20:04.025796 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.025778 2571 scope.go:117] "RemoveContainer" containerID="c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29" Apr 24 22:20:04.032362 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.032341 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"] Apr 24 22:20:04.032647 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.032626 2571 scope.go:117] "RemoveContainer" containerID="093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab" Apr 24 22:20:04.032893 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:20:04.032878 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab\": container with ID starting with 093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab not found: ID does not exist" 
containerID="093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab" Apr 24 22:20:04.032941 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.032900 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab"} err="failed to get container status \"093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab\": rpc error: code = NotFound desc = could not find container \"093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab\": container with ID starting with 093d66aceb3d85658cc4ec56ecb3d2fe1d40224a9274c94e84298f5ff4dd0dab not found: ID does not exist" Apr 24 22:20:04.032941 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.032920 2571 scope.go:117] "RemoveContainer" containerID="da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d" Apr 24 22:20:04.033139 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:20:04.033125 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d\": container with ID starting with da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d not found: ID does not exist" containerID="da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d" Apr 24 22:20:04.033184 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.033143 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d"} err="failed to get container status \"da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d\": rpc error: code = NotFound desc = could not find container \"da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d\": container with ID starting with da2ccbd26f29395d3a2bd57c40fd6f9d7c78087d5bb0a9679a2ff55b5f4f1a3d not found: ID does not exist" Apr 24 22:20:04.033184 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.033155 2571 scope.go:117] "RemoveContainer" containerID="c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29" Apr 24 22:20:04.033528 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:20:04.033494 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29\": container with ID starting with c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29 not found: ID does not exist" containerID="c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29" Apr 24 22:20:04.033675 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.033575 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29"} err="failed to get container status \"c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29\": rpc error: code = NotFound desc = could not find container \"c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29\": container with ID starting with c6b1de5ff323b661b1af207bc010fd870855bd6ef407497dfa00658bed444f29 not found: ID does not exist" Apr 24 22:20:04.035070 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:04.035051 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-p4qn9"] Apr 24 22:20:05.014996 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:05.014964 2571 generic.go:358] "Generic (PLEG): container finished" podID="e6a84850-46be-46f3-ace8-6048a654b706" containerID="5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb" exitCode=0 Apr 24 22:20:05.015538 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:05.015046 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" 
event={"ID":"e6a84850-46be-46f3-ace8-6048a654b706","Type":"ContainerDied","Data":"5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb"} Apr 24 22:20:05.647158 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:05.647121 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" path="/var/lib/kubelet/pods/e0cfc683-b931-4dc5-ac45-a9f4c7232481/volumes" Apr 24 22:20:06.020526 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:06.020489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" event={"ID":"e6a84850-46be-46f3-ace8-6048a654b706","Type":"ContainerStarted","Data":"b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b"} Apr 24 22:20:06.020889 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:06.020535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" event={"ID":"e6a84850-46be-46f3-ace8-6048a654b706","Type":"ContainerStarted","Data":"f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8"} Apr 24 22:20:06.020889 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:06.020743 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" Apr 24 22:20:06.044013 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:06.043939 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" podStartSLOduration=6.043925716 podStartE2EDuration="6.043925716s" podCreationTimestamp="2026-04-24 22:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:20:06.043045765 +0000 UTC m=+3206.891089895" watchObservedRunningTime="2026-04-24 22:20:06.043925716 +0000 UTC m=+3206.891969831" 
Apr 24 22:20:07.023574 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:07.023547 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" Apr 24 22:20:13.032456 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:13.032421 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" Apr 24 22:20:43.036135 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:43.036062 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" Apr 24 22:20:50.208412 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.208375 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"] Apr 24 22:20:50.208915 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.208697 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kserve-container" containerID="cri-o://f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8" gracePeriod=30 Apr 24 22:20:50.208915 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.208731 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kube-rbac-proxy" containerID="cri-o://b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b" gracePeriod=30 Apr 24 22:20:50.283936 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.283905 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"] Apr 24 22:20:50.284182 ip-10-0-139-5 kubenswrapper[2571]: 
I0424 22:20:50.284171 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="storage-initializer" Apr 24 22:20:50.284234 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.284184 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="storage-initializer" Apr 24 22:20:50.284234 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.284193 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kube-rbac-proxy" Apr 24 22:20:50.284234 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.284198 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kube-rbac-proxy" Apr 24 22:20:50.284234 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.284208 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" Apr 24 22:20:50.284234 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.284213 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" Apr 24 22:20:50.284441 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.284257 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kube-rbac-proxy" Apr 24 22:20:50.284441 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.284266 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0cfc683-b931-4dc5-ac45-a9f4c7232481" containerName="kserve-container" Apr 24 22:20:50.287122 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.287106 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.289494 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.289473 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 22:20:50.289779 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.289763 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 24 22:20:50.299239 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.299219 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"] Apr 24 22:20:50.394254 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.394223 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.394416 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.394273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.394416 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.394338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.394416 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.394383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws8t4\" (UniqueName: \"kubernetes.io/projected/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kube-api-access-ws8t4\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.495653 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.495621 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws8t4\" (UniqueName: \"kubernetes.io/projected/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kube-api-access-ws8t4\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.495771 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.495667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.495771 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.495716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.495771 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.495741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.495883 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:20:50.495834 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-serving-cert: secret "xgboost-v2-mlserver-predictor-serving-cert" not found Apr 24 22:20:50.495929 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:20:50.495890 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls podName:4019f012-e5ba-46b2-b1a6-8dec38dda6c7 nodeName:}" failed. No retries permitted until 2026-04-24 22:20:50.995871403 +0000 UTC m=+3251.843915499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls") pod "xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" (UID: "4019f012-e5ba-46b2-b1a6-8dec38dda6c7") : secret "xgboost-v2-mlserver-predictor-serving-cert" not found Apr 24 22:20:50.496146 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.496122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.496406 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.496389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:50.507176 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:50.507158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws8t4\" (UniqueName: \"kubernetes.io/projected/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kube-api-access-ws8t4\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:51.000395 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:51.000366 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls\") pod 
\"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:51.002984 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:51.002957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-vwxgn\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" Apr 24 22:20:51.154855 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:51.154822 2571 generic.go:358] "Generic (PLEG): container finished" podID="e6a84850-46be-46f3-ace8-6048a654b706" containerID="b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b" exitCode=2 Apr 24 22:20:51.155010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:51.154865 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" event={"ID":"e6a84850-46be-46f3-ace8-6048a654b706","Type":"ContainerDied","Data":"b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b"} Apr 24 22:20:51.196120 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:51.196092 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"
Apr 24 22:20:51.312451 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:51.312391 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"]
Apr 24 22:20:51.316170 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:20:51.316143 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4019f012_e5ba_46b2_b1a6_8dec38dda6c7.slice/crio-ed5188e1e6abcb28b90f3542e20213fac9a0783eebf4f49f658a374c7217c739 WatchSource:0}: Error finding container ed5188e1e6abcb28b90f3542e20213fac9a0783eebf4f49f658a374c7217c739: Status 404 returned error can't find the container with id ed5188e1e6abcb28b90f3542e20213fac9a0783eebf4f49f658a374c7217c739
Apr 24 22:20:52.158659 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:52.158623 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" event={"ID":"4019f012-e5ba-46b2-b1a6-8dec38dda6c7","Type":"ContainerStarted","Data":"cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e"}
Apr 24 22:20:52.158659 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:52.158661 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" event={"ID":"4019f012-e5ba-46b2-b1a6-8dec38dda6c7","Type":"ContainerStarted","Data":"ed5188e1e6abcb28b90f3542e20213fac9a0783eebf4f49f658a374c7217c739"}
Apr 24 22:20:53.027581 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:53.027543 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused"
Apr 24 22:20:53.033541 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:53.033517 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.54:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 24 22:20:55.167070 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:55.166976 2571 generic.go:358] "Generic (PLEG): container finished" podID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerID="cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e" exitCode=0
Apr 24 22:20:55.167070 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:55.167051 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" event={"ID":"4019f012-e5ba-46b2-b1a6-8dec38dda6c7","Type":"ContainerDied","Data":"cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e"}
Apr 24 22:20:56.171801 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.171770 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" event={"ID":"4019f012-e5ba-46b2-b1a6-8dec38dda6c7","Type":"ContainerStarted","Data":"6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2"}
Apr 24 22:20:56.172237 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.171811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" event={"ID":"4019f012-e5ba-46b2-b1a6-8dec38dda6c7","Type":"ContainerStarted","Data":"bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84"}
Apr 24 22:20:56.172237 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.172023 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"
Apr 24 22:20:56.172237 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.172052 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"
Apr 24 22:20:56.192618 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.192576 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" podStartSLOduration=6.19256175 podStartE2EDuration="6.19256175s" podCreationTimestamp="2026-04-24 22:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:20:56.190254606 +0000 UTC m=+3257.038298732" watchObservedRunningTime="2026-04-24 22:20:56.19256175 +0000 UTC m=+3257.040605864"
Apr 24 22:20:56.554365 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.554342 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:56.644660 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.644623 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6a84850-46be-46f3-ace8-6048a654b706-proxy-tls\") pod \"e6a84850-46be-46f3-ace8-6048a654b706\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") "
Apr 24 22:20:56.644877 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.644696 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6a84850-46be-46f3-ace8-6048a654b706-kserve-provision-location\") pod \"e6a84850-46be-46f3-ace8-6048a654b706\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") "
Apr 24 22:20:56.644877 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.644746 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6a84850-46be-46f3-ace8-6048a654b706-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"e6a84850-46be-46f3-ace8-6048a654b706\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") "
Apr 24 22:20:56.644877 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.644789 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7ns6\" (UniqueName: \"kubernetes.io/projected/e6a84850-46be-46f3-ace8-6048a654b706-kube-api-access-v7ns6\") pod \"e6a84850-46be-46f3-ace8-6048a654b706\" (UID: \"e6a84850-46be-46f3-ace8-6048a654b706\") "
Apr 24 22:20:56.645061 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.645014 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a84850-46be-46f3-ace8-6048a654b706-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e6a84850-46be-46f3-ace8-6048a654b706" (UID: "e6a84850-46be-46f3-ace8-6048a654b706"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:20:56.645104 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.645057 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a84850-46be-46f3-ace8-6048a654b706-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "e6a84850-46be-46f3-ace8-6048a654b706" (UID: "e6a84850-46be-46f3-ace8-6048a654b706"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:20:56.646910 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.646887 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a84850-46be-46f3-ace8-6048a654b706-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e6a84850-46be-46f3-ace8-6048a654b706" (UID: "e6a84850-46be-46f3-ace8-6048a654b706"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:20:56.647001 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.646976 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a84850-46be-46f3-ace8-6048a654b706-kube-api-access-v7ns6" (OuterVolumeSpecName: "kube-api-access-v7ns6") pod "e6a84850-46be-46f3-ace8-6048a654b706" (UID: "e6a84850-46be-46f3-ace8-6048a654b706"). InnerVolumeSpecName "kube-api-access-v7ns6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:20:56.746374 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.746342 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6a84850-46be-46f3-ace8-6048a654b706-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:20:56.746374 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.746371 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6a84850-46be-46f3-ace8-6048a654b706-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:20:56.746624 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.746382 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6a84850-46be-46f3-ace8-6048a654b706-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:20:56.746624 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:56.746393 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7ns6\" (UniqueName: \"kubernetes.io/projected/e6a84850-46be-46f3-ace8-6048a654b706-kube-api-access-v7ns6\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:20:57.176236 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.176152 2571 generic.go:358] "Generic (PLEG): container finished" podID="e6a84850-46be-46f3-ace8-6048a654b706" containerID="f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8" exitCode=0
Apr 24 22:20:57.176236 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.176227 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"
Apr 24 22:20:57.176712 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.176234 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" event={"ID":"e6a84850-46be-46f3-ace8-6048a654b706","Type":"ContainerDied","Data":"f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8"}
Apr 24 22:20:57.176712 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.176272 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc" event={"ID":"e6a84850-46be-46f3-ace8-6048a654b706","Type":"ContainerDied","Data":"517cba52228a68b21ff2f9ad2d21034a6b61924b9befff31defd31e0a9a7d2b6"}
Apr 24 22:20:57.176712 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.176306 2571 scope.go:117] "RemoveContainer" containerID="b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b"
Apr 24 22:20:57.184957 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.184937 2571 scope.go:117] "RemoveContainer" containerID="f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8"
Apr 24 22:20:57.191857 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.191841 2571 scope.go:117] "RemoveContainer" containerID="5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb"
Apr 24 22:20:57.196903 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.196882 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"]
Apr 24 22:20:57.199395 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.199383 2571 scope.go:117] "RemoveContainer" containerID="b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b"
Apr 24 22:20:57.199688 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:20:57.199667 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b\": container with ID starting with b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b not found: ID does not exist" containerID="b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b"
Apr 24 22:20:57.199754 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.199698 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b"} err="failed to get container status \"b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b\": rpc error: code = NotFound desc = could not find container \"b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b\": container with ID starting with b284b8a1cac00874eccb0b021557c5ee5cd7f81e2189bbcb1d44095998c3178b not found: ID does not exist"
Apr 24 22:20:57.199754 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.199720 2571 scope.go:117] "RemoveContainer" containerID="f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8"
Apr 24 22:20:57.200008 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:20:57.199981 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8\": container with ID starting with f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8 not found: ID does not exist" containerID="f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8"
Apr 24 22:20:57.200058 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.200007 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8"} err="failed to get container status \"f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8\": rpc error: code = NotFound desc = could not find container \"f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8\": container with ID starting with f9ab24e650aa8ae1ae1399baf78a01f111c1c36339d9743fa4d7f234104e23d8 not found: ID does not exist"
Apr 24 22:20:57.200058 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.200025 2571 scope.go:117] "RemoveContainer" containerID="5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb"
Apr 24 22:20:57.200156 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.200117 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-7g2mc"]
Apr 24 22:20:57.200290 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:20:57.200272 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb\": container with ID starting with 5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb not found: ID does not exist" containerID="5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb"
Apr 24 22:20:57.200370 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.200312 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb"} err="failed to get container status \"5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb\": rpc error: code = NotFound desc = could not find container \"5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb\": container with ID starting with 5e8a6abadafd9ceb6623f4222c73729ca1adab10c9e8e3c7d4fc87179094e7eb not found: ID does not exist"
Apr 24 22:20:57.647553 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:20:57.647521 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a84850-46be-46f3-ace8-6048a654b706" path="/var/lib/kubelet/pods/e6a84850-46be-46f3-ace8-6048a654b706/volumes"
Apr 24 22:21:02.182600 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:02.182567 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"
Apr 24 22:21:32.187052 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:32.187022 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"
Apr 24 22:21:40.346902 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.346864 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"]
Apr 24 22:21:40.347387 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.347167 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kserve-container" containerID="cri-o://bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84" gracePeriod=30
Apr 24 22:21:40.347387 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.347213 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kube-rbac-proxy" containerID="cri-o://6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2" gracePeriod=30
Apr 24 22:21:40.441989 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.441953 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"]
Apr 24 22:21:40.442276 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.442263 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kube-rbac-proxy"
Apr 24 22:21:40.442341 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.442279 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kube-rbac-proxy"
Apr 24 22:21:40.442341 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.442287 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kserve-container"
Apr 24 22:21:40.442341 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.442305 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kserve-container"
Apr 24 22:21:40.442341 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.442314 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="storage-initializer"
Apr 24 22:21:40.442341 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.442320 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="storage-initializer"
Apr 24 22:21:40.442508 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.442366 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kserve-container"
Apr 24 22:21:40.442508 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.442378 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6a84850-46be-46f3-ace8-6048a654b706" containerName="kube-rbac-proxy"
Apr 24 22:21:40.445614 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.445597 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.448060 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.448039 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\""
Apr 24 22:21:40.448060 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.448053 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\""
Apr 24 22:21:40.456260 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.456239 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"]
Apr 24 22:21:40.481147 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.481126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvt4t\" (UniqueName: \"kubernetes.io/projected/34d4ce16-3bac-454b-aefb-989e7f9397b7-kube-api-access-kvt4t\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.481235 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.481191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34d4ce16-3bac-454b-aefb-989e7f9397b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.481235 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.481218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34d4ce16-3bac-454b-aefb-989e7f9397b7-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.481332 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.481247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34d4ce16-3bac-454b-aefb-989e7f9397b7-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.581605 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.581576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvt4t\" (UniqueName: \"kubernetes.io/projected/34d4ce16-3bac-454b-aefb-989e7f9397b7-kube-api-access-kvt4t\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.581732 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.581651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34d4ce16-3bac-454b-aefb-989e7f9397b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.581732 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.581671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34d4ce16-3bac-454b-aefb-989e7f9397b7-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.581732 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.581700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34d4ce16-3bac-454b-aefb-989e7f9397b7-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.582077 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.582057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34d4ce16-3bac-454b-aefb-989e7f9397b7-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.582394 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.582371 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34d4ce16-3bac-454b-aefb-989e7f9397b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.584090 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.584066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34d4ce16-3bac-454b-aefb-989e7f9397b7-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.590718 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.590695 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvt4t\" (UniqueName: \"kubernetes.io/projected/34d4ce16-3bac-454b-aefb-989e7f9397b7-kube-api-access-kvt4t\") pod \"isvc-xgboost-runtime-predictor-779db84d9-6nqmk\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.756875 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.756847 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:40.872995 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:40.872963 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"]
Apr 24 22:21:40.878521 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:21:40.876489 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34d4ce16_3bac_454b_aefb_989e7f9397b7.slice/crio-2eac37a588350f60bef51780615042b83d0749b0539dd01f09a4125a9d5e3010 WatchSource:0}: Error finding container 2eac37a588350f60bef51780615042b83d0749b0539dd01f09a4125a9d5e3010: Status 404 returned error can't find the container with id 2eac37a588350f60bef51780615042b83d0749b0539dd01f09a4125a9d5e3010
Apr 24 22:21:41.293704 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:41.293670 2571 generic.go:358] "Generic (PLEG): container finished" podID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerID="6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2" exitCode=2
Apr 24 22:21:41.293885 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:41.293729 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" event={"ID":"4019f012-e5ba-46b2-b1a6-8dec38dda6c7","Type":"ContainerDied","Data":"6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2"}
Apr 24 22:21:41.295108 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:41.295087 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" event={"ID":"34d4ce16-3bac-454b-aefb-989e7f9397b7","Type":"ContainerStarted","Data":"2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba"}
Apr 24 22:21:41.295225 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:41.295112 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" event={"ID":"34d4ce16-3bac-454b-aefb-989e7f9397b7","Type":"ContainerStarted","Data":"2eac37a588350f60bef51780615042b83d0749b0539dd01f09a4125a9d5e3010"}
Apr 24 22:21:42.176857 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:42.176811 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.55:8643/healthz\": dial tcp 10.134.0.55:8643: connect: connection refused"
Apr 24 22:21:45.308073 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:45.308041 2571 generic.go:358] "Generic (PLEG): container finished" podID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerID="2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba" exitCode=0
Apr 24 22:21:45.308443 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:45.308094 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" event={"ID":"34d4ce16-3bac-454b-aefb-989e7f9397b7","Type":"ContainerDied","Data":"2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba"}
Apr 24 22:21:46.312458 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.312415 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" event={"ID":"34d4ce16-3bac-454b-aefb-989e7f9397b7","Type":"ContainerStarted","Data":"820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6"}
Apr 24 22:21:46.312458 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.312454 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" event={"ID":"34d4ce16-3bac-454b-aefb-989e7f9397b7","Type":"ContainerStarted","Data":"d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1"}
Apr 24 22:21:46.312984 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.312671 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:46.335356 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.335222 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podStartSLOduration=6.33520612 podStartE2EDuration="6.33520612s" podCreationTimestamp="2026-04-24 22:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:21:46.33462377 +0000 UTC m=+3307.182667885" watchObservedRunningTime="2026-04-24 22:21:46.33520612 +0000 UTC m=+3307.183250236"
Apr 24 22:21:46.475525 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.475502 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"
Apr 24 22:21:46.521208 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.521183 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kserve-provision-location\") pod \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") "
Apr 24 22:21:46.521392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.521220 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls\") pod \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") "
Apr 24 22:21:46.521392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.521253 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws8t4\" (UniqueName: \"kubernetes.io/projected/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kube-api-access-ws8t4\") pod \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") "
Apr 24 22:21:46.521392 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.521328 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\" (UID: \"4019f012-e5ba-46b2-b1a6-8dec38dda6c7\") "
Apr 24 22:21:46.521649 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.521621 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4019f012-e5ba-46b2-b1a6-8dec38dda6c7" (UID: "4019f012-e5ba-46b2-b1a6-8dec38dda6c7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:21:46.521727 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.521628 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "4019f012-e5ba-46b2-b1a6-8dec38dda6c7" (UID: "4019f012-e5ba-46b2-b1a6-8dec38dda6c7"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:21:46.523369 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.523352 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4019f012-e5ba-46b2-b1a6-8dec38dda6c7" (UID: "4019f012-e5ba-46b2-b1a6-8dec38dda6c7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:21:46.523506 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.523492 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kube-api-access-ws8t4" (OuterVolumeSpecName: "kube-api-access-ws8t4") pod "4019f012-e5ba-46b2-b1a6-8dec38dda6c7" (UID: "4019f012-e5ba-46b2-b1a6-8dec38dda6c7"). InnerVolumeSpecName "kube-api-access-ws8t4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:21:46.622530 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.622448 2571 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:21:46.622530 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.622477 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:21:46.622530 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.622487 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:21:46.622530 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:46.622496 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ws8t4\" (UniqueName: \"kubernetes.io/projected/4019f012-e5ba-46b2-b1a6-8dec38dda6c7-kube-api-access-ws8t4\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:21:47.316428 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.316389 2571 generic.go:358] "Generic (PLEG): container finished" podID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerID="bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84" exitCode=0
Apr 24 22:21:47.316878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.316476 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"
Apr 24 22:21:47.316878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.316469 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" event={"ID":"4019f012-e5ba-46b2-b1a6-8dec38dda6c7","Type":"ContainerDied","Data":"bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84"}
Apr 24 22:21:47.316878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.316633 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn" event={"ID":"4019f012-e5ba-46b2-b1a6-8dec38dda6c7","Type":"ContainerDied","Data":"ed5188e1e6abcb28b90f3542e20213fac9a0783eebf4f49f658a374c7217c739"}
Apr 24 22:21:47.316878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.316653 2571 scope.go:117] "RemoveContainer" containerID="6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2"
Apr 24 22:21:47.317275 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.317197 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"
Apr 24 22:21:47.318152 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.318116 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 24 22:21:47.325094 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.325075 2571 scope.go:117] "RemoveContainer" containerID="bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84"
Apr 24 22:21:47.332723 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.332709 2571 scope.go:117] "RemoveContainer"
containerID="cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e" Apr 24 22:21:47.337901 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.337881 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"] Apr 24 22:21:47.340031 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.340013 2571 scope.go:117] "RemoveContainer" containerID="6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2" Apr 24 22:21:47.340279 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:21:47.340262 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2\": container with ID starting with 6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2 not found: ID does not exist" containerID="6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2" Apr 24 22:21:47.340366 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.340312 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2"} err="failed to get container status \"6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2\": rpc error: code = NotFound desc = could not find container \"6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2\": container with ID starting with 6c17551885d6f2b4de961d8f490cdb016e613dbfae3571f17d1243f49c0452a2 not found: ID does not exist" Apr 24 22:21:47.340366 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.340329 2571 scope.go:117] "RemoveContainer" containerID="bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84" Apr 24 22:21:47.340538 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:21:47.340524 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84\": container with ID starting with bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84 not found: ID does not exist" containerID="bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84" Apr 24 22:21:47.340578 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.340542 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84"} err="failed to get container status \"bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84\": rpc error: code = NotFound desc = could not find container \"bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84\": container with ID starting with bcdfdfa3fc4286dc4dba5ff9018208cae3f55379e00888e39c278b57491dec84 not found: ID does not exist" Apr 24 22:21:47.340578 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.340554 2571 scope.go:117] "RemoveContainer" containerID="cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e" Apr 24 22:21:47.340741 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:21:47.340729 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e\": container with ID starting with cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e not found: ID does not exist" containerID="cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e" Apr 24 22:21:47.340792 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.340745 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e"} err="failed to get container status \"cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e\": rpc error: code = NotFound desc = could not find container 
\"cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e\": container with ID starting with cfd9ece74dd72fded9efde5e49859880fefa66e96c884ef1ac02ce75526a534e not found: ID does not exist" Apr 24 22:21:47.344498 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.344473 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-vwxgn"] Apr 24 22:21:47.647102 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:47.647027 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" path="/var/lib/kubelet/pods/4019f012-e5ba-46b2-b1a6-8dec38dda6c7/volumes" Apr 24 22:21:48.319730 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:48.319690 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 22:21:53.324477 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:53.324446 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" Apr 24 22:21:53.325104 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:21:53.325080 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 22:22:03.325321 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:03.325215 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: 
connection refused" Apr 24 22:22:13.325317 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:13.325250 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 22:22:23.325449 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:23.325401 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 22:22:33.325533 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:33.325493 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 22:22:43.326091 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:43.326053 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" Apr 24 22:22:50.529929 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.529890 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"] Apr 24 22:22:50.530453 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.530182 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" containerID="cri-o://d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1" gracePeriod=30 Apr 24 
22:22:50.530453 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.530224 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kube-rbac-proxy" containerID="cri-o://820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6" gracePeriod=30 Apr 24 22:22:50.627248 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.627215 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m"] Apr 24 22:22:50.627512 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.627500 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kube-rbac-proxy" Apr 24 22:22:50.627565 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.627514 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kube-rbac-proxy" Apr 24 22:22:50.627565 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.627530 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kserve-container" Apr 24 22:22:50.627565 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.627535 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kserve-container" Apr 24 22:22:50.627565 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.627560 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="storage-initializer" Apr 24 22:22:50.627565 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.627566 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="storage-initializer" Apr 24 22:22:50.627717 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:22:50.627617 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kserve-container" Apr 24 22:22:50.627717 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.627624 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4019f012-e5ba-46b2-b1a6-8dec38dda6c7" containerName="kube-rbac-proxy" Apr 24 22:22:50.630511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.630492 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.632763 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.632736 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:22:50.632854 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.632807 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 24 22:22:50.639735 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.639716 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m"] Apr 24 22:22:50.811839 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.811752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9799560a-00b2-4104-b641-7a3332423e84-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.811839 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.811791 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.811839 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.811810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9799560a-00b2-4104-b641-7a3332423e84-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.812065 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.811893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjfxx\" (UniqueName: \"kubernetes.io/projected/9799560a-00b2-4104-b641-7a3332423e84-kube-api-access-pjfxx\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.912645 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.912607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9799560a-00b2-4104-b641-7a3332423e84-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.912645 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.912650 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.912882 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.912677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9799560a-00b2-4104-b641-7a3332423e84-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.912882 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.912708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjfxx\" (UniqueName: \"kubernetes.io/projected/9799560a-00b2-4104-b641-7a3332423e84-kube-api-access-pjfxx\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.912882 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:22:50.912789 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-serving-cert: secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 24 22:22:50.912882 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:22:50.912865 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls podName:9799560a-00b2-4104-b641-7a3332423e84 nodeName:}" failed. No retries permitted until 2026-04-24 22:22:51.412842835 +0000 UTC m=+3372.260886929 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls") pod "isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" (UID: "9799560a-00b2-4104-b641-7a3332423e84") : secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 24 22:22:50.913095 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.913073 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9799560a-00b2-4104-b641-7a3332423e84-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.913343 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.913326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9799560a-00b2-4104-b641-7a3332423e84-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:50.921721 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:50.921694 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjfxx\" (UniqueName: \"kubernetes.io/projected/9799560a-00b2-4104-b641-7a3332423e84-kube-api-access-pjfxx\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:51.417692 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:51.417647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:51.417873 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:22:51.417802 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-serving-cert: secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 24 22:22:51.417917 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:22:51.417876 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls podName:9799560a-00b2-4104-b641-7a3332423e84 nodeName:}" failed. No retries permitted until 2026-04-24 22:22:52.417859691 +0000 UTC m=+3373.265903784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls") pod "isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" (UID: "9799560a-00b2-4104-b641-7a3332423e84") : secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 24 22:22:51.497187 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:51.497155 2571 generic.go:358] "Generic (PLEG): container finished" podID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerID="820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6" exitCode=2 Apr 24 22:22:51.497371 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:51.497226 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" event={"ID":"34d4ce16-3bac-454b-aefb-989e7f9397b7","Type":"ContainerDied","Data":"820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6"} Apr 24 22:22:52.426633 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:52.426603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:52.429264 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:52.429227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:52.441145 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:52.441119 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:52.563356 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:52.563274 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m"] Apr 24 22:22:52.565840 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:22:52.565809 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9799560a_00b2_4104_b641_7a3332423e84.slice/crio-95bbe881f1cb2875b90d0a9a91c7e252ee327248a706d9564e78649b8acc698a WatchSource:0}: Error finding container 95bbe881f1cb2875b90d0a9a91c7e252ee327248a706d9564e78649b8acc698a: Status 404 returned error can't find the container with id 95bbe881f1cb2875b90d0a9a91c7e252ee327248a706d9564e78649b8acc698a Apr 24 22:22:53.320568 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:53.320524 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.56:8643/healthz\": dial tcp 10.134.0.56:8643: connect: connection refused" Apr 24 22:22:53.325874 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:53.325846 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 22:22:53.504947 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:53.504910 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" event={"ID":"9799560a-00b2-4104-b641-7a3332423e84","Type":"ContainerStarted","Data":"927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a"} Apr 24 22:22:53.504947 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:53.504952 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" event={"ID":"9799560a-00b2-4104-b641-7a3332423e84","Type":"ContainerStarted","Data":"95bbe881f1cb2875b90d0a9a91c7e252ee327248a706d9564e78649b8acc698a"} Apr 24 22:22:54.160559 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.160537 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" Apr 24 22:22:54.341349 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.341219 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34d4ce16-3bac-454b-aefb-989e7f9397b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"34d4ce16-3bac-454b-aefb-989e7f9397b7\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " Apr 24 22:22:54.341349 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.341290 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvt4t\" (UniqueName: \"kubernetes.io/projected/34d4ce16-3bac-454b-aefb-989e7f9397b7-kube-api-access-kvt4t\") pod \"34d4ce16-3bac-454b-aefb-989e7f9397b7\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " Apr 24 22:22:54.341349 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.341338 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34d4ce16-3bac-454b-aefb-989e7f9397b7-kserve-provision-location\") pod \"34d4ce16-3bac-454b-aefb-989e7f9397b7\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " Apr 24 22:22:54.341673 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.341364 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34d4ce16-3bac-454b-aefb-989e7f9397b7-proxy-tls\") pod \"34d4ce16-3bac-454b-aefb-989e7f9397b7\" (UID: \"34d4ce16-3bac-454b-aefb-989e7f9397b7\") " Apr 24 22:22:54.341673 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.341657 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d4ce16-3bac-454b-aefb-989e7f9397b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "34d4ce16-3bac-454b-aefb-989e7f9397b7" (UID: "34d4ce16-3bac-454b-aefb-989e7f9397b7"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:22:54.341787 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.341713 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d4ce16-3bac-454b-aefb-989e7f9397b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34d4ce16-3bac-454b-aefb-989e7f9397b7" (UID: "34d4ce16-3bac-454b-aefb-989e7f9397b7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:22:54.343717 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.343694 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d4ce16-3bac-454b-aefb-989e7f9397b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "34d4ce16-3bac-454b-aefb-989e7f9397b7" (UID: "34d4ce16-3bac-454b-aefb-989e7f9397b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:22:54.343717 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.343706 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d4ce16-3bac-454b-aefb-989e7f9397b7-kube-api-access-kvt4t" (OuterVolumeSpecName: "kube-api-access-kvt4t") pod "34d4ce16-3bac-454b-aefb-989e7f9397b7" (UID: "34d4ce16-3bac-454b-aefb-989e7f9397b7"). InnerVolumeSpecName "kube-api-access-kvt4t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:22:54.442734 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.442697 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34d4ce16-3bac-454b-aefb-989e7f9397b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:22:54.442734 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.442725 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvt4t\" (UniqueName: \"kubernetes.io/projected/34d4ce16-3bac-454b-aefb-989e7f9397b7-kube-api-access-kvt4t\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:22:54.442734 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.442735 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34d4ce16-3bac-454b-aefb-989e7f9397b7-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:22:54.442734 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.442744 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34d4ce16-3bac-454b-aefb-989e7f9397b7-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:22:54.509686 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.509648 2571 generic.go:358] "Generic (PLEG): container finished" podID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerID="d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1" exitCode=0 Apr 24 22:22:54.510063 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.509745 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" Apr 24 22:22:54.510063 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.509744 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" event={"ID":"34d4ce16-3bac-454b-aefb-989e7f9397b7","Type":"ContainerDied","Data":"d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1"} Apr 24 22:22:54.510063 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.509786 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk" event={"ID":"34d4ce16-3bac-454b-aefb-989e7f9397b7","Type":"ContainerDied","Data":"2eac37a588350f60bef51780615042b83d0749b0539dd01f09a4125a9d5e3010"} Apr 24 22:22:54.510063 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.509806 2571 scope.go:117] "RemoveContainer" containerID="820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6" Apr 24 22:22:54.517681 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.517664 2571 scope.go:117] "RemoveContainer" containerID="d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1" Apr 24 22:22:54.528401 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.528382 2571 scope.go:117] "RemoveContainer" containerID="2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba" Apr 24 22:22:54.532586 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.532564 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"] Apr 24 22:22:54.535462 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.535447 2571 scope.go:117] "RemoveContainer" containerID="820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6" Apr 24 22:22:54.535676 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:22:54.535660 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6\": container with ID starting with 820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6 not found: ID does not exist" containerID="820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6" Apr 24 22:22:54.535716 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.535683 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6"} err="failed to get container status \"820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6\": rpc error: code = NotFound desc = could not find container \"820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6\": container with ID starting with 820b89a265894eff67f403309191c680ee46f95a5c0b2f7a085e49c9432aafd6 not found: ID does not exist" Apr 24 22:22:54.535716 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.535701 2571 scope.go:117] "RemoveContainer" containerID="d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1" Apr 24 22:22:54.535909 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:22:54.535896 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1\": container with ID starting with d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1 not found: ID does not exist" containerID="d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1" Apr 24 22:22:54.535951 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.535911 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1"} err="failed to get container status \"d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1\": rpc error: code = NotFound desc = could not find container 
\"d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1\": container with ID starting with d8d51bf0a6dae27fee9a6a07173e703c7cb859489795537655adf5b7a0f640d1 not found: ID does not exist" Apr 24 22:22:54.535951 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.535922 2571 scope.go:117] "RemoveContainer" containerID="2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba" Apr 24 22:22:54.536096 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:22:54.536079 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba\": container with ID starting with 2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba not found: ID does not exist" containerID="2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba" Apr 24 22:22:54.536139 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.536098 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba"} err="failed to get container status \"2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba\": rpc error: code = NotFound desc = could not find container \"2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba\": container with ID starting with 2c099565b05315e120636a6c7e7d34f3148090c6ac900922aeb0e226b4f508ba not found: ID does not exist" Apr 24 22:22:54.537968 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:54.537948 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-6nqmk"] Apr 24 22:22:55.648723 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:55.648690 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" path="/var/lib/kubelet/pods/34d4ce16-3bac-454b-aefb-989e7f9397b7/volumes" Apr 24 22:22:56.517544 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:22:56.517449 2571 generic.go:358] "Generic (PLEG): container finished" podID="9799560a-00b2-4104-b641-7a3332423e84" containerID="927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a" exitCode=0 Apr 24 22:22:56.517544 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:56.517530 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" event={"ID":"9799560a-00b2-4104-b641-7a3332423e84","Type":"ContainerDied","Data":"927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a"} Apr 24 22:22:57.521612 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:57.521578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" event={"ID":"9799560a-00b2-4104-b641-7a3332423e84","Type":"ContainerStarted","Data":"0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb"} Apr 24 22:22:57.521948 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:57.521617 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" event={"ID":"9799560a-00b2-4104-b641-7a3332423e84","Type":"ContainerStarted","Data":"8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd"} Apr 24 22:22:57.521948 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:57.521815 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:22:57.550736 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:57.550695 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" podStartSLOduration=7.550680597 podStartE2EDuration="7.550680597s" podCreationTimestamp="2026-04-24 22:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-24 22:22:57.550100246 +0000 UTC m=+3378.398144363" watchObservedRunningTime="2026-04-24 22:22:57.550680597 +0000 UTC m=+3378.398724713" Apr 24 22:22:58.525463 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:22:58.525435 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:23:04.533671 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:04.533639 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:23:34.626432 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:34.626334 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 22:23:44.536685 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:44.536654 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:23:50.737354 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.737319 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m"] Apr 24 22:23:50.737773 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.737651 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kserve-container" containerID="cri-o://8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd" gracePeriod=30 Apr 24 22:23:50.737773 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.737692 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kube-rbac-proxy" containerID="cri-o://0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb" gracePeriod=30 Apr 24 22:23:50.813318 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813268 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt"] Apr 24 22:23:50.813603 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813588 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="storage-initializer" Apr 24 22:23:50.813660 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813608 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="storage-initializer" Apr 24 22:23:50.813660 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813621 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kube-rbac-proxy" Apr 24 22:23:50.813660 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813626 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kube-rbac-proxy" Apr 24 22:23:50.813660 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813635 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" Apr 24 22:23:50.813660 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813640 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" Apr 24 22:23:50.813818 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813690 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" 
containerName="kube-rbac-proxy" Apr 24 22:23:50.813818 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.813703 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="34d4ce16-3bac-454b-aefb-989e7f9397b7" containerName="kserve-container" Apr 24 22:23:50.816696 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.816679 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:50.818941 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.818918 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 24 22:23:50.819055 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.818963 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:23:50.825695 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.825670 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt"] Apr 24 22:23:50.974500 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.974466 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09bd9953-b200-4d8a-b528-10e727d47637-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:50.974676 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.974507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: 
\"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:50.974676 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.974551 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09bd9953-b200-4d8a-b528-10e727d47637-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:50.974676 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:50.974612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dllml\" (UniqueName: \"kubernetes.io/projected/09bd9953-b200-4d8a-b528-10e727d47637-kube-api-access-dllml\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.074972 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.074931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09bd9953-b200-4d8a-b528-10e727d47637-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.075164 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.074991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dllml\" (UniqueName: \"kubernetes.io/projected/09bd9953-b200-4d8a-b528-10e727d47637-kube-api-access-dllml\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 
24 22:23:51.075164 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.075039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09bd9953-b200-4d8a-b528-10e727d47637-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.075164 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.075064 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.075360 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:23:51.075169 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 24 22:23:51.075360 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:23:51.075238 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls podName:09bd9953-b200-4d8a-b528-10e727d47637 nodeName:}" failed. No retries permitted until 2026-04-24 22:23:51.575216428 +0000 UTC m=+3432.423260523 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" (UID: "09bd9953-b200-4d8a-b528-10e727d47637") : secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 24 22:23:51.075454 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.075418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09bd9953-b200-4d8a-b528-10e727d47637-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.075709 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.075691 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09bd9953-b200-4d8a-b528-10e727d47637-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.084476 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.084449 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dllml\" (UniqueName: \"kubernetes.io/projected/09bd9953-b200-4d8a-b528-10e727d47637-kube-api-access-dllml\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.579575 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.579522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls\") pod 
\"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.582055 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.582036 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.669847 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.669816 2571 generic.go:358] "Generic (PLEG): container finished" podID="9799560a-00b2-4104-b641-7a3332423e84" containerID="0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb" exitCode=2 Apr 24 22:23:51.670013 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.669891 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" event={"ID":"9799560a-00b2-4104-b641-7a3332423e84","Type":"ContainerDied","Data":"0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb"} Apr 24 22:23:51.727413 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.727375 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:51.849148 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.849073 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt"] Apr 24 22:23:51.852091 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:23:51.852064 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09bd9953_b200_4d8a_b528_10e727d47637.slice/crio-feb34c9394a293825c14c0d7b16e9a9342bafbaabea06c4660c6ce1d8de74067 WatchSource:0}: Error finding container feb34c9394a293825c14c0d7b16e9a9342bafbaabea06c4660c6ce1d8de74067: Status 404 returned error can't find the container with id feb34c9394a293825c14c0d7b16e9a9342bafbaabea06c4660c6ce1d8de74067 Apr 24 22:23:51.855278 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:51.854577 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:23:52.673649 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:52.673616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" event={"ID":"09bd9953-b200-4d8a-b528-10e727d47637","Type":"ContainerStarted","Data":"1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58"} Apr 24 22:23:52.673649 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:52.673652 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" event={"ID":"09bd9953-b200-4d8a-b528-10e727d47637","Type":"ContainerStarted","Data":"feb34c9394a293825c14c0d7b16e9a9342bafbaabea06c4660c6ce1d8de74067"} Apr 24 22:23:54.528743 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:54.528687 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" 
podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.57:8643/healthz\": dial tcp 10.134.0.57:8643: connect: connection refused" Apr 24 22:23:54.534249 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:54.534214 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 24 22:23:55.682732 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:55.682698 2571 generic.go:358] "Generic (PLEG): container finished" podID="09bd9953-b200-4d8a-b528-10e727d47637" containerID="1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58" exitCode=0 Apr 24 22:23:55.683096 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:55.682748 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" event={"ID":"09bd9953-b200-4d8a-b528-10e727d47637","Type":"ContainerDied","Data":"1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58"} Apr 24 22:23:56.687029 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:56.686991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" event={"ID":"09bd9953-b200-4d8a-b528-10e727d47637","Type":"ContainerStarted","Data":"394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a"} Apr 24 22:23:56.687029 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:56.687024 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" event={"ID":"09bd9953-b200-4d8a-b528-10e727d47637","Type":"ContainerStarted","Data":"153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d"} Apr 24 
22:23:56.687454 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:56.687238 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:56.706181 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:56.706131 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podStartSLOduration=6.70611699 podStartE2EDuration="6.70611699s" podCreationTimestamp="2026-04-24 22:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:23:56.704719579 +0000 UTC m=+3437.552763707" watchObservedRunningTime="2026-04-24 22:23:56.70611699 +0000 UTC m=+3437.554161107" Apr 24 22:23:57.689792 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:57.689762 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:23:57.690983 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:57.690956 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:23:58.385783 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.385760 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:23:58.535753 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.535714 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjfxx\" (UniqueName: \"kubernetes.io/projected/9799560a-00b2-4104-b641-7a3332423e84-kube-api-access-pjfxx\") pod \"9799560a-00b2-4104-b641-7a3332423e84\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " Apr 24 22:23:58.535753 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.535762 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9799560a-00b2-4104-b641-7a3332423e84-kserve-provision-location\") pod \"9799560a-00b2-4104-b641-7a3332423e84\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " Apr 24 22:23:58.536016 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.535781 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9799560a-00b2-4104-b641-7a3332423e84-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"9799560a-00b2-4104-b641-7a3332423e84\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " Apr 24 22:23:58.536016 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.535806 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls\") pod \"9799560a-00b2-4104-b641-7a3332423e84\" (UID: \"9799560a-00b2-4104-b641-7a3332423e84\") " Apr 24 22:23:58.536134 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.536084 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9799560a-00b2-4104-b641-7a3332423e84-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"9799560a-00b2-4104-b641-7a3332423e84" (UID: "9799560a-00b2-4104-b641-7a3332423e84"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:23:58.536198 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.536167 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9799560a-00b2-4104-b641-7a3332423e84-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "9799560a-00b2-4104-b641-7a3332423e84" (UID: "9799560a-00b2-4104-b641-7a3332423e84"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:23:58.538066 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.538036 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9799560a-00b2-4104-b641-7a3332423e84" (UID: "9799560a-00b2-4104-b641-7a3332423e84"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:23:58.538176 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.538075 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9799560a-00b2-4104-b641-7a3332423e84-kube-api-access-pjfxx" (OuterVolumeSpecName: "kube-api-access-pjfxx") pod "9799560a-00b2-4104-b641-7a3332423e84" (UID: "9799560a-00b2-4104-b641-7a3332423e84"). InnerVolumeSpecName "kube-api-access-pjfxx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:23:58.636761 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.636715 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjfxx\" (UniqueName: \"kubernetes.io/projected/9799560a-00b2-4104-b641-7a3332423e84-kube-api-access-pjfxx\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:23:58.636761 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.636755 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9799560a-00b2-4104-b641-7a3332423e84-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:23:58.636761 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.636767 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9799560a-00b2-4104-b641-7a3332423e84-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:23:58.636989 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.636777 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9799560a-00b2-4104-b641-7a3332423e84-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:23:58.694261 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.694221 2571 generic.go:358] "Generic (PLEG): container finished" podID="9799560a-00b2-4104-b641-7a3332423e84" containerID="8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd" exitCode=0 Apr 24 22:23:58.694667 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.694322 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" Apr 24 22:23:58.694667 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.694330 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" event={"ID":"9799560a-00b2-4104-b641-7a3332423e84","Type":"ContainerDied","Data":"8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd"} Apr 24 22:23:58.694667 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.694369 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m" event={"ID":"9799560a-00b2-4104-b641-7a3332423e84","Type":"ContainerDied","Data":"95bbe881f1cb2875b90d0a9a91c7e252ee327248a706d9564e78649b8acc698a"} Apr 24 22:23:58.694667 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.694387 2571 scope.go:117] "RemoveContainer" containerID="0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb" Apr 24 22:23:58.695041 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.695012 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:23:58.702373 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.702357 2571 scope.go:117] "RemoveContainer" containerID="8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd" Apr 24 22:23:58.709539 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.709518 2571 scope.go:117] "RemoveContainer" containerID="927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a" Apr 24 22:23:58.715264 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.715239 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m"] Apr 24 
22:23:58.718578 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.718560 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nk56m"] Apr 24 22:23:58.718636 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.718606 2571 scope.go:117] "RemoveContainer" containerID="0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb" Apr 24 22:23:58.718890 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:23:58.718875 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb\": container with ID starting with 0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb not found: ID does not exist" containerID="0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb" Apr 24 22:23:58.718929 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.718899 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb"} err="failed to get container status \"0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb\": rpc error: code = NotFound desc = could not find container \"0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb\": container with ID starting with 0778cd9ba05993b1c7d70044b27f295a897956963f200536fe7705b8e0ca1feb not found: ID does not exist" Apr 24 22:23:58.718929 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.718916 2571 scope.go:117] "RemoveContainer" containerID="8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd" Apr 24 22:23:58.719143 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:23:58.719128 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd\": container with ID starting with 
8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd not found: ID does not exist" containerID="8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd" Apr 24 22:23:58.719185 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.719148 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd"} err="failed to get container status \"8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd\": rpc error: code = NotFound desc = could not find container \"8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd\": container with ID starting with 8e57b02bfc2146e73531a93391f55c2a73f4128c4be52e666576a72b851c7abd not found: ID does not exist" Apr 24 22:23:58.719185 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.719161 2571 scope.go:117] "RemoveContainer" containerID="927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a" Apr 24 22:23:58.719361 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:23:58.719346 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a\": container with ID starting with 927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a not found: ID does not exist" containerID="927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a" Apr 24 22:23:58.719405 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:58.719368 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a"} err="failed to get container status \"927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a\": rpc error: code = NotFound desc = could not find container \"927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a\": container with ID starting with 
927134fc34b2ee7872b03aafb1be5e91cc2be6f351b9794ebfd448779b1c2a3a not found: ID does not exist" Apr 24 22:23:59.652485 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:23:59.652450 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9799560a-00b2-4104-b641-7a3332423e84" path="/var/lib/kubelet/pods/9799560a-00b2-4104-b641-7a3332423e84/volumes" Apr 24 22:24:03.700656 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:24:03.700629 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:24:03.701209 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:24:03.701183 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:24:13.702050 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:24:13.702005 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:24:23.701257 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:24:23.701211 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:24:33.701430 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:24:33.701389 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:24:43.702114 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:24:43.702074 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:24:53.701851 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:24:53.701814 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:25:03.701975 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:03.701896 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:25:10.948015 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:10.947981 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt"] Apr 24 22:25:10.948447 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:10.948262 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" containerID="cri-o://153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d" gracePeriod=30 Apr 24 22:25:10.948447 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:10.948284 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kube-rbac-proxy" 
containerID="cri-o://394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a" gracePeriod=30 Apr 24 22:25:11.027137 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027106 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9"] Apr 24 22:25:11.027458 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027444 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="storage-initializer" Apr 24 22:25:11.027458 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027459 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="storage-initializer" Apr 24 22:25:11.027554 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027477 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kserve-container" Apr 24 22:25:11.027554 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027487 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kserve-container" Apr 24 22:25:11.027554 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027494 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kube-rbac-proxy" Apr 24 22:25:11.027554 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027500 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kube-rbac-proxy" Apr 24 22:25:11.027554 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027547 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kube-rbac-proxy" Apr 24 22:25:11.027704 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.027560 2571 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="9799560a-00b2-4104-b641-7a3332423e84" containerName="kserve-container" Apr 24 22:25:11.030565 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.030546 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.033359 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.033337 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 24 22:25:11.033359 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.033348 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 22:25:11.033533 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.033344 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 24 22:25:11.040550 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.040527 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9"] Apr 24 22:25:11.103373 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.103342 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.103497 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.103396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: 
\"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.103497 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.103452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d5faeda-c627-4072-8c9d-28dd0dad36b3-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.103497 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.103478 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dmrf\" (UniqueName: \"kubernetes.io/projected/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kube-api-access-8dmrf\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.204822 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.204738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.204822 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.204785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d5faeda-c627-4072-8c9d-28dd0dad36b3-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.204822 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.204815 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dmrf\" (UniqueName: \"kubernetes.io/projected/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kube-api-access-8dmrf\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.205085 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:25:11.204895 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 24 22:25:11.205085 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.204899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.205085 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:25:11.204972 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls podName:5d5faeda-c627-4072-8c9d-28dd0dad36b3 nodeName:}" failed. No retries permitted until 2026-04-24 22:25:11.704949869 +0000 UTC m=+3512.552993977 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls") pod "isvc-sklearn-s3-predictor-d954bcd99-79lq9" (UID: "5d5faeda-c627-4072-8c9d-28dd0dad36b3") : secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 24 22:25:11.205320 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.205284 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.205524 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.205505 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d5faeda-c627-4072-8c9d-28dd0dad36b3-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.213642 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.213611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dmrf\" (UniqueName: \"kubernetes.io/projected/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kube-api-access-8dmrf\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.709589 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.709557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" 
(UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.712030 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.712011 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls\") pod \"isvc-sklearn-s3-predictor-d954bcd99-79lq9\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:11.899018 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.898986 2571 generic.go:358] "Generic (PLEG): container finished" podID="09bd9953-b200-4d8a-b528-10e727d47637" containerID="394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a" exitCode=2 Apr 24 22:25:11.899182 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.899061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" event={"ID":"09bd9953-b200-4d8a-b528-10e727d47637","Type":"ContainerDied","Data":"394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a"} Apr 24 22:25:11.940878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:11.940851 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:12.061641 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:12.061614 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9"] Apr 24 22:25:12.063549 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:25:12.063522 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d5faeda_c627_4072_8c9d_28dd0dad36b3.slice/crio-eb9b736e67fae83a7eec6b63106c8d61b008eabee55e26cc6d0059f0c6019666 WatchSource:0}: Error finding container eb9b736e67fae83a7eec6b63106c8d61b008eabee55e26cc6d0059f0c6019666: Status 404 returned error can't find the container with id eb9b736e67fae83a7eec6b63106c8d61b008eabee55e26cc6d0059f0c6019666 Apr 24 22:25:12.903096 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:12.903053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" event={"ID":"5d5faeda-c627-4072-8c9d-28dd0dad36b3","Type":"ContainerStarted","Data":"f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264"} Apr 24 22:25:12.903096 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:12.903096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" event={"ID":"5d5faeda-c627-4072-8c9d-28dd0dad36b3","Type":"ContainerStarted","Data":"eb9b736e67fae83a7eec6b63106c8d61b008eabee55e26cc6d0059f0c6019666"} Apr 24 22:25:13.695663 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:13.695625 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.58:8643/healthz\": dial tcp 10.134.0.58:8643: connect: connection refused" Apr 24 22:25:13.701333 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:13.701283 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 22:25:13.907566 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:13.907532 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerID="f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264" exitCode=0 Apr 24 22:25:13.907770 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:13.907619 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" event={"ID":"5d5faeda-c627-4072-8c9d-28dd0dad36b3","Type":"ContainerDied","Data":"f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264"} Apr 24 22:25:14.479741 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.479721 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:25:14.633115 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.633022 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dllml\" (UniqueName: \"kubernetes.io/projected/09bd9953-b200-4d8a-b528-10e727d47637-kube-api-access-dllml\") pod \"09bd9953-b200-4d8a-b528-10e727d47637\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " Apr 24 22:25:14.633115 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.633061 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls\") pod \"09bd9953-b200-4d8a-b528-10e727d47637\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " Apr 24 22:25:14.633115 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.633113 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09bd9953-b200-4d8a-b528-10e727d47637-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"09bd9953-b200-4d8a-b528-10e727d47637\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " Apr 24 22:25:14.633456 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.633212 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09bd9953-b200-4d8a-b528-10e727d47637-kserve-provision-location\") pod \"09bd9953-b200-4d8a-b528-10e727d47637\" (UID: \"09bd9953-b200-4d8a-b528-10e727d47637\") " Apr 24 22:25:14.633518 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.633467 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09bd9953-b200-4d8a-b528-10e727d47637-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod 
"09bd9953-b200-4d8a-b528-10e727d47637" (UID: "09bd9953-b200-4d8a-b528-10e727d47637"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:25:14.633579 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.633546 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09bd9953-b200-4d8a-b528-10e727d47637-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "09bd9953-b200-4d8a-b528-10e727d47637" (UID: "09bd9953-b200-4d8a-b528-10e727d47637"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:25:14.635274 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.635252 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09bd9953-b200-4d8a-b528-10e727d47637-kube-api-access-dllml" (OuterVolumeSpecName: "kube-api-access-dllml") pod "09bd9953-b200-4d8a-b528-10e727d47637" (UID: "09bd9953-b200-4d8a-b528-10e727d47637"). InnerVolumeSpecName "kube-api-access-dllml". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:25:14.635274 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.635257 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "09bd9953-b200-4d8a-b528-10e727d47637" (UID: "09bd9953-b200-4d8a-b528-10e727d47637"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:25:14.733749 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.733711 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dllml\" (UniqueName: \"kubernetes.io/projected/09bd9953-b200-4d8a-b528-10e727d47637-kube-api-access-dllml\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:25:14.733749 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.733741 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09bd9953-b200-4d8a-b528-10e727d47637-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:25:14.733749 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.733752 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/09bd9953-b200-4d8a-b528-10e727d47637-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:25:14.734274 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.733762 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09bd9953-b200-4d8a-b528-10e727d47637-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:25:14.912031 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.911948 2571 generic.go:358] "Generic (PLEG): container finished" podID="09bd9953-b200-4d8a-b528-10e727d47637" containerID="153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d" exitCode=0 Apr 24 22:25:14.912172 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.912035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" event={"ID":"09bd9953-b200-4d8a-b528-10e727d47637","Type":"ContainerDied","Data":"153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d"} Apr 24 
22:25:14.912172 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.912058 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" Apr 24 22:25:14.912172 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.912071 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt" event={"ID":"09bd9953-b200-4d8a-b528-10e727d47637","Type":"ContainerDied","Data":"feb34c9394a293825c14c0d7b16e9a9342bafbaabea06c4660c6ce1d8de74067"} Apr 24 22:25:14.912172 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.912086 2571 scope.go:117] "RemoveContainer" containerID="394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a" Apr 24 22:25:14.914052 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.914030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" event={"ID":"5d5faeda-c627-4072-8c9d-28dd0dad36b3","Type":"ContainerStarted","Data":"6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d"} Apr 24 22:25:14.914172 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.914061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" event={"ID":"5d5faeda-c627-4072-8c9d-28dd0dad36b3","Type":"ContainerStarted","Data":"54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4"} Apr 24 22:25:14.914311 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.914277 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:14.914385 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.914327 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:14.915546 ip-10-0-139-5 kubenswrapper[2571]: I0424 
22:25:14.915523 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 22:25:14.920637 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.920622 2571 scope.go:117] "RemoveContainer" containerID="153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d" Apr 24 22:25:14.929447 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.929431 2571 scope.go:117] "RemoveContainer" containerID="1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58" Apr 24 22:25:14.935858 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.935836 2571 scope.go:117] "RemoveContainer" containerID="394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a" Apr 24 22:25:14.936098 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:25:14.936078 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a\": container with ID starting with 394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a not found: ID does not exist" containerID="394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a" Apr 24 22:25:14.936165 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.936106 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a"} err="failed to get container status \"394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a\": rpc error: code = NotFound desc = could not find container \"394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a\": container with ID starting with 394b4fe1572e30ced019f332844e302ce6eba5a4756ecbdf54b4989c9e73b98a not found: ID does not exist" Apr 24 
22:25:14.936165 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.936123 2571 scope.go:117] "RemoveContainer" containerID="153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d" Apr 24 22:25:14.936369 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:25:14.936350 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d\": container with ID starting with 153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d not found: ID does not exist" containerID="153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d" Apr 24 22:25:14.936417 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.936377 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d"} err="failed to get container status \"153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d\": rpc error: code = NotFound desc = could not find container \"153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d\": container with ID starting with 153b7e188f3325826b87d628e73f5de90c6362368ef67cfcf72b898359f8899d not found: ID does not exist" Apr 24 22:25:14.936417 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.936395 2571 scope.go:117] "RemoveContainer" containerID="1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58" Apr 24 22:25:14.936661 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:25:14.936645 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58\": container with ID starting with 1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58 not found: ID does not exist" containerID="1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58" Apr 24 22:25:14.936722 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:25:14.936665 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58"} err="failed to get container status \"1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58\": rpc error: code = NotFound desc = could not find container \"1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58\": container with ID starting with 1348b9ecc892294615ff0a8eb2426aa5886a699c3888aa6fcab8df3caf5c1e58 not found: ID does not exist" Apr 24 22:25:14.940484 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.940448 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podStartSLOduration=3.940438064 podStartE2EDuration="3.940438064s" podCreationTimestamp="2026-04-24 22:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:25:14.938919078 +0000 UTC m=+3515.786963201" watchObservedRunningTime="2026-04-24 22:25:14.940438064 +0000 UTC m=+3515.788482179" Apr 24 22:25:14.952513 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.952490 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt"] Apr 24 22:25:14.957113 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:14.957092 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4k8vt"] Apr 24 22:25:15.647870 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:15.647831 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09bd9953-b200-4d8a-b528-10e727d47637" path="/var/lib/kubelet/pods/09bd9953-b200-4d8a-b528-10e727d47637/volumes" Apr 24 22:25:15.918364 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:15.918262 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 22:25:20.922665 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:20.922633 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:25:20.923274 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:20.923239 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 22:25:30.923868 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:30.923832 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 22:25:40.923837 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:40.923798 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 22:25:50.924119 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:25:50.924080 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 22:26:00.923771 ip-10-0-139-5 kubenswrapper[2571]: I0424 
22:26:00.923733 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 22:26:10.923927 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:10.923887 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 22:26:20.924081 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:20.924046 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:26:31.115541 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.115453 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9"] Apr 24 22:26:31.116006 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.115902 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container" containerID="cri-o://54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4" gracePeriod=30 Apr 24 22:26:31.116091 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.115987 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kube-rbac-proxy" containerID="cri-o://6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d" gracePeriod=30 Apr 24 22:26:31.262790 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.262753 2571 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"] Apr 24 22:26:31.263087 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.263072 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" Apr 24 22:26:31.263087 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.263088 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" Apr 24 22:26:31.263192 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.263107 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kube-rbac-proxy" Apr 24 22:26:31.263192 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.263112 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kube-rbac-proxy" Apr 24 22:26:31.263192 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.263120 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="storage-initializer" Apr 24 22:26:31.263192 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.263126 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="storage-initializer" Apr 24 22:26:31.263192 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.263167 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kserve-container" Apr 24 22:26:31.263192 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.263176 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="09bd9953-b200-4d8a-b528-10e727d47637" containerName="kube-rbac-proxy" Apr 24 22:26:31.266037 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.266020 
2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.268592 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.268572 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 24 22:26:31.268852 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.268835 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 22:26:31.269010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.268991 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 24 22:26:31.275729 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.275709 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"] Apr 24 22:26:31.436072 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.435991 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87bb2a02-dd40-457e-9705-e3818f41d37e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.436072 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.436027 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.436072 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.436047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87bb2a02-dd40-457e-9705-e3818f41d37e-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.436288 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.436117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrlqc\" (UniqueName: \"kubernetes.io/projected/87bb2a02-dd40-457e-9705-e3818f41d37e-kube-api-access-mrlqc\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.436288 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.436185 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.536626 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.536582 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87bb2a02-dd40-457e-9705-e3818f41d37e-kserve-provision-location\") pod 
\"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.536626 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.536631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.536897 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.536660 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87bb2a02-dd40-457e-9705-e3818f41d37e-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.536897 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.536694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrlqc\" (UniqueName: \"kubernetes.io/projected/87bb2a02-dd40-457e-9705-e3818f41d37e-kube-api-access-mrlqc\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.536897 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.536758 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-cabundle-cert\") pod 
\"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.537061 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.537017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87bb2a02-dd40-457e-9705-e3818f41d37e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.537343 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.537290 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.537474 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.537390 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.539267 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.539245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87bb2a02-dd40-457e-9705-e3818f41d37e-proxy-tls\") pod 
\"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.545717 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.545695 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrlqc\" (UniqueName: \"kubernetes.io/projected/87bb2a02-dd40-457e-9705-e3818f41d37e-kube-api-access-mrlqc\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.576631 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.576609 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:31.697469 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:31.697398 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"] Apr 24 22:26:31.701122 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:26:31.701095 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87bb2a02_dd40_457e_9705_e3818f41d37e.slice/crio-09f2d204d4523d4b04c548f210f61dc929279d2b3f59d395283a5cf382ff163c WatchSource:0}: Error finding container 09f2d204d4523d4b04c548f210f61dc929279d2b3f59d395283a5cf382ff163c: Status 404 returned error can't find the container with id 09f2d204d4523d4b04c548f210f61dc929279d2b3f59d395283a5cf382ff163c Apr 24 22:26:32.126555 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:32.126518 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerID="6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d" exitCode=2 Apr 24 22:26:32.126920 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:26:32.126590 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" event={"ID":"5d5faeda-c627-4072-8c9d-28dd0dad36b3","Type":"ContainerDied","Data":"6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d"} Apr 24 22:26:32.128094 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:32.128025 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" event={"ID":"87bb2a02-dd40-457e-9705-e3818f41d37e","Type":"ContainerStarted","Data":"870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9"} Apr 24 22:26:32.128094 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:32.128060 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" event={"ID":"87bb2a02-dd40-457e-9705-e3818f41d37e","Type":"ContainerStarted","Data":"09f2d204d4523d4b04c548f210f61dc929279d2b3f59d395283a5cf382ff163c"} Apr 24 22:26:33.132819 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:33.132785 2571 generic.go:358] "Generic (PLEG): container finished" podID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerID="870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9" exitCode=0 Apr 24 22:26:33.133210 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:33.132874 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" event={"ID":"87bb2a02-dd40-457e-9705-e3818f41d37e","Type":"ContainerDied","Data":"870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9"} Apr 24 22:26:34.138188 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:34.138150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" 
event={"ID":"87bb2a02-dd40-457e-9705-e3818f41d37e","Type":"ContainerStarted","Data":"8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10"} Apr 24 22:26:34.138188 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:34.138188 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" event={"ID":"87bb2a02-dd40-457e-9705-e3818f41d37e","Type":"ContainerStarted","Data":"31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4"} Apr 24 22:26:34.138644 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:34.138319 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:34.158971 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:34.158928 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podStartSLOduration=3.158914785 podStartE2EDuration="3.158914785s" podCreationTimestamp="2026-04-24 22:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:26:34.157869502 +0000 UTC m=+3595.005913651" watchObservedRunningTime="2026-04-24 22:26:34.158914785 +0000 UTC m=+3595.006958900" Apr 24 22:26:35.141308 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.141257 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" Apr 24 22:26:35.142423 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.142397 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: 
connection refused" Apr 24 22:26:35.454093 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.454072 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:26:35.565965 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.565915 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d5faeda-c627-4072-8c9d-28dd0dad36b3-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " Apr 24 22:26:35.566141 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.566027 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls\") pod \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " Apr 24 22:26:35.566141 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.566076 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dmrf\" (UniqueName: \"kubernetes.io/projected/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kube-api-access-8dmrf\") pod \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " Apr 24 22:26:35.566141 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.566106 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kserve-provision-location\") pod \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\" (UID: \"5d5faeda-c627-4072-8c9d-28dd0dad36b3\") " Apr 24 22:26:35.566383 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.566356 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5d5faeda-c627-4072-8c9d-28dd0dad36b3-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "5d5faeda-c627-4072-8c9d-28dd0dad36b3" (UID: "5d5faeda-c627-4072-8c9d-28dd0dad36b3"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:26:35.566546 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.566523 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5d5faeda-c627-4072-8c9d-28dd0dad36b3" (UID: "5d5faeda-c627-4072-8c9d-28dd0dad36b3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:26:35.568205 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.568184 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5d5faeda-c627-4072-8c9d-28dd0dad36b3" (UID: "5d5faeda-c627-4072-8c9d-28dd0dad36b3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:26:35.568309 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.568273 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kube-api-access-8dmrf" (OuterVolumeSpecName: "kube-api-access-8dmrf") pod "5d5faeda-c627-4072-8c9d-28dd0dad36b3" (UID: "5d5faeda-c627-4072-8c9d-28dd0dad36b3"). InnerVolumeSpecName "kube-api-access-8dmrf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:26:35.666923 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.666894 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dmrf\" (UniqueName: \"kubernetes.io/projected/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kube-api-access-8dmrf\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:26:35.666923 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.666918 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d5faeda-c627-4072-8c9d-28dd0dad36b3-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:26:35.667060 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.666930 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d5faeda-c627-4072-8c9d-28dd0dad36b3-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:26:35.667060 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:35.666939 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d5faeda-c627-4072-8c9d-28dd0dad36b3-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:26:36.145221 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.145188 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerID="54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4" exitCode=0 Apr 24 22:26:36.145670 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.145261 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" event={"ID":"5d5faeda-c627-4072-8c9d-28dd0dad36b3","Type":"ContainerDied","Data":"54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4"} Apr 24 
22:26:36.145670 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.145315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" event={"ID":"5d5faeda-c627-4072-8c9d-28dd0dad36b3","Type":"ContainerDied","Data":"eb9b736e67fae83a7eec6b63106c8d61b008eabee55e26cc6d0059f0c6019666"} Apr 24 22:26:36.145670 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.145332 2571 scope.go:117] "RemoveContainer" containerID="6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d" Apr 24 22:26:36.145670 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.145358 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9" Apr 24 22:26:36.145917 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.145701 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 22:26:36.153471 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.153456 2571 scope.go:117] "RemoveContainer" containerID="54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4" Apr 24 22:26:36.163419 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.163402 2571 scope.go:117] "RemoveContainer" containerID="f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264" Apr 24 22:26:36.164476 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.164457 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9"] Apr 24 22:26:36.171653 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.168038 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d954bcd99-79lq9"] Apr 24 22:26:36.175462 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:26:36.175448 2571 scope.go:117] "RemoveContainer" containerID="6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d"
Apr 24 22:26:36.175721 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:26:36.175703 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d\": container with ID starting with 6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d not found: ID does not exist" containerID="6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d"
Apr 24 22:26:36.175797 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.175734 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d"} err="failed to get container status \"6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d\": rpc error: code = NotFound desc = could not find container \"6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d\": container with ID starting with 6feda9947c6fe3d6faba6a381ee0b0d0dee92c25f3368c0e59e759c41bcd0b2d not found: ID does not exist"
Apr 24 22:26:36.175797 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.175760 2571 scope.go:117] "RemoveContainer" containerID="54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4"
Apr 24 22:26:36.176020 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:26:36.176001 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4\": container with ID starting with 54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4 not found: ID does not exist" containerID="54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4"
Apr 24 22:26:36.176062 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.176027 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4"} err="failed to get container status \"54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4\": rpc error: code = NotFound desc = could not find container \"54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4\": container with ID starting with 54d51b2717233c9fff6309e300e5637a6041f76492e97b7f5f381c9225d422b4 not found: ID does not exist"
Apr 24 22:26:36.176062 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.176047 2571 scope.go:117] "RemoveContainer" containerID="f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264"
Apr 24 22:26:36.176268 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:26:36.176250 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264\": container with ID starting with f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264 not found: ID does not exist" containerID="f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264"
Apr 24 22:26:36.176328 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:36.176273 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264"} err="failed to get container status \"f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264\": rpc error: code = NotFound desc = could not find container \"f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264\": container with ID starting with f61f9406cd430d0ae43f0279be0edb96a03566b5d09e1bf4c2255f69011aa264 not found: ID does not exist"
Apr 24 22:26:37.647686 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:37.647655 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" path="/var/lib/kubelet/pods/5d5faeda-c627-4072-8c9d-28dd0dad36b3/volumes"
Apr 24 22:26:41.151333 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:41.151285 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"
Apr 24 22:26:41.151860 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:41.151833 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused"
Apr 24 22:26:51.151806 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:26:51.151768 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused"
Apr 24 22:27:01.151941 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:01.151894 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused"
Apr 24 22:27:11.152009 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:11.151973 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused"
Apr 24 22:27:21.152586 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:21.152548 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused"
Apr 24 22:27:31.151980 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:31.151938 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused"
Apr 24 22:27:41.153014 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:41.152983 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"
Apr 24 22:27:51.285155 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:51.285124 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"]
Apr 24 22:27:51.285777 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:51.285429 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" containerID="cri-o://31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4" gracePeriod=30
Apr 24 22:27:51.285777 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:51.285498 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kube-rbac-proxy" containerID="cri-o://8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10" gracePeriod=30
Apr 24 22:27:52.347923 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.347889 2571 generic.go:358] "Generic (PLEG): container finished" podID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerID="8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10" exitCode=2
Apr 24 22:27:52.348276 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.347930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" event={"ID":"87bb2a02-dd40-457e-9705-e3818f41d37e","Type":"ContainerDied","Data":"8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10"}
Apr 24 22:27:52.359917 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.359883 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"]
Apr 24 22:27:52.360207 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.360192 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kube-rbac-proxy"
Apr 24 22:27:52.360280 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.360210 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kube-rbac-proxy"
Apr 24 22:27:52.360280 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.360239 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="storage-initializer"
Apr 24 22:27:52.360280 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.360248 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="storage-initializer"
Apr 24 22:27:52.360280 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.360259 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container"
Apr 24 22:27:52.360280 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.360268 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container"
Apr 24 22:27:52.360573 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.360363 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kube-rbac-proxy"
Apr 24 22:27:52.360573 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.360380 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d5faeda-c627-4072-8c9d-28dd0dad36b3" containerName="kserve-container"
Apr 24 22:27:52.363498 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.363481 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.365785 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.365766 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\""
Apr 24 22:27:52.365878 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.365766 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\""
Apr 24 22:27:52.375804 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.375785 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"]
Apr 24 22:27:52.536503 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.536455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.536679 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.536522 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chk7\" (UniqueName: \"kubernetes.io/projected/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kube-api-access-6chk7\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.536679 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.536557 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.536679 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.536604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.638097 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.637999 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.638097 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.638082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.638097 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.638102 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chk7\" (UniqueName: \"kubernetes.io/projected/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kube-api-access-6chk7\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.638415 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.638123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.638415 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:27:52.638247 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found
Apr 24 22:27:52.638415 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:27:52.638344 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls podName:dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8 nodeName:}" failed. No retries permitted until 2026-04-24 22:27:53.138324868 +0000 UTC m=+3673.986368962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls") pod "isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" (UID: "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8") : secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found
Apr 24 22:27:52.638575 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.638556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.638773 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.638756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:52.646690 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:52.646668 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chk7\" (UniqueName: \"kubernetes.io/projected/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kube-api-access-6chk7\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:53.142640 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:53.142607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:53.145313 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:53.145270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:53.272469 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:53.272431 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:27:53.393163 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:53.393061 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"]
Apr 24 22:27:53.395719 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:27:53.395688 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfa7b3af_7f5f_4e0d_bfc5_34fc4c2a10c8.slice/crio-bac632738afbaeec338b1c89702301d22cf8e34bd5e7ca3c19e94d5d2ece70f0 WatchSource:0}: Error finding container bac632738afbaeec338b1c89702301d22cf8e34bd5e7ca3c19e94d5d2ece70f0: Status 404 returned error can't find the container with id bac632738afbaeec338b1c89702301d22cf8e34bd5e7ca3c19e94d5d2ece70f0
Apr 24 22:27:54.354696 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:54.354664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" event={"ID":"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8","Type":"ContainerStarted","Data":"90fa20540e5d46842dce3a9bdbb9331456f9db933a5afa0795020e5895ac798f"}
Apr 24 22:27:54.354696 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:54.354700 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" event={"ID":"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8","Type":"ContainerStarted","Data":"bac632738afbaeec338b1c89702301d22cf8e34bd5e7ca3c19e94d5d2ece70f0"}
Apr 24 22:27:55.619960 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.619940 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"
Apr 24 22:27:55.763702 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.763655 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"87bb2a02-dd40-457e-9705-e3818f41d37e\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") "
Apr 24 22:27:55.763883 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.763727 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrlqc\" (UniqueName: \"kubernetes.io/projected/87bb2a02-dd40-457e-9705-e3818f41d37e-kube-api-access-mrlqc\") pod \"87bb2a02-dd40-457e-9705-e3818f41d37e\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") "
Apr 24 22:27:55.763883 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.763758 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-cabundle-cert\") pod \"87bb2a02-dd40-457e-9705-e3818f41d37e\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") "
Apr 24 22:27:55.763964 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.763913 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87bb2a02-dd40-457e-9705-e3818f41d37e-kserve-provision-location\") pod \"87bb2a02-dd40-457e-9705-e3818f41d37e\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") "
Apr 24 22:27:55.764018 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.763977 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87bb2a02-dd40-457e-9705-e3818f41d37e-proxy-tls\") pod \"87bb2a02-dd40-457e-9705-e3818f41d37e\" (UID: \"87bb2a02-dd40-457e-9705-e3818f41d37e\") "
Apr 24 22:27:55.764126 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.764094 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "87bb2a02-dd40-457e-9705-e3818f41d37e" (UID: "87bb2a02-dd40-457e-9705-e3818f41d37e"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:27:55.764209 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.764128 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "87bb2a02-dd40-457e-9705-e3818f41d37e" (UID: "87bb2a02-dd40-457e-9705-e3818f41d37e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:27:55.764267 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.764225 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87bb2a02-dd40-457e-9705-e3818f41d37e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87bb2a02-dd40-457e-9705-e3818f41d37e" (UID: "87bb2a02-dd40-457e-9705-e3818f41d37e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:27:55.764267 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.764243 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-cabundle-cert\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:27:55.764267 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.764256 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87bb2a02-dd40-457e-9705-e3818f41d37e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:27:55.766043 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.766022 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87bb2a02-dd40-457e-9705-e3818f41d37e-kube-api-access-mrlqc" (OuterVolumeSpecName: "kube-api-access-mrlqc") pod "87bb2a02-dd40-457e-9705-e3818f41d37e" (UID: "87bb2a02-dd40-457e-9705-e3818f41d37e"). InnerVolumeSpecName "kube-api-access-mrlqc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:27:55.766191 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.766169 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87bb2a02-dd40-457e-9705-e3818f41d37e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "87bb2a02-dd40-457e-9705-e3818f41d37e" (UID: "87bb2a02-dd40-457e-9705-e3818f41d37e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:27:55.865599 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.865551 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrlqc\" (UniqueName: \"kubernetes.io/projected/87bb2a02-dd40-457e-9705-e3818f41d37e-kube-api-access-mrlqc\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:27:55.865599 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.865590 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87bb2a02-dd40-457e-9705-e3818f41d37e-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:27:55.865599 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:55.865600 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87bb2a02-dd40-457e-9705-e3818f41d37e-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:27:56.362205 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.362176 2571 generic.go:358] "Generic (PLEG): container finished" podID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerID="31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4" exitCode=0
Apr 24 22:27:56.362402 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.362238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" event={"ID":"87bb2a02-dd40-457e-9705-e3818f41d37e","Type":"ContainerDied","Data":"31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4"}
Apr 24 22:27:56.362402 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.362255 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"
Apr 24 22:27:56.362402 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.362273 2571 scope.go:117] "RemoveContainer" containerID="8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10"
Apr 24 22:27:56.362402 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.362263 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc" event={"ID":"87bb2a02-dd40-457e-9705-e3818f41d37e","Type":"ContainerDied","Data":"09f2d204d4523d4b04c548f210f61dc929279d2b3f59d395283a5cf382ff163c"}
Apr 24 22:27:56.370124 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.370104 2571 scope.go:117] "RemoveContainer" containerID="31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4"
Apr 24 22:27:56.376775 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.376754 2571 scope.go:117] "RemoveContainer" containerID="870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9"
Apr 24 22:27:56.383423 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.383406 2571 scope.go:117] "RemoveContainer" containerID="8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10"
Apr 24 22:27:56.383701 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:27:56.383681 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10\": container with ID starting with 8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10 not found: ID does not exist" containerID="8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10"
Apr 24 22:27:56.383788 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.383708 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10"} err="failed to get container status \"8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10\": rpc error: code = NotFound desc = could not find container \"8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10\": container with ID starting with 8111dc557535b15b87c5a14c84d670913417714e7fb75d692e6d8ea62d7d0a10 not found: ID does not exist"
Apr 24 22:27:56.383788 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.383727 2571 scope.go:117] "RemoveContainer" containerID="31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4"
Apr 24 22:27:56.383967 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:27:56.383940 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4\": container with ID starting with 31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4 not found: ID does not exist" containerID="31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4"
Apr 24 22:27:56.384010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.383978 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4"} err="failed to get container status \"31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4\": rpc error: code = NotFound desc = could not find container \"31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4\": container with ID starting with 31323d2d29f056b309e9dfdbb1e288469ad8291e64f235f19b49a5539b38a2c4 not found: ID does not exist"
Apr 24 22:27:56.384010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.384001 2571 scope.go:117] "RemoveContainer" containerID="870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9"
Apr 24 22:27:56.384180 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.384167 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"]
Apr 24 22:27:56.384241 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:27:56.384219 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9\": container with ID starting with 870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9 not found: ID does not exist" containerID="870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9"
Apr 24 22:27:56.384279 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.384237 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9"} err="failed to get container status \"870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9\": rpc error: code = NotFound desc = could not find container \"870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9\": container with ID starting with 870c4620a4294c2c2605b8d710e0a0429a28160a4dc6f979864a70848fe6eaf9 not found: ID does not exist"
Apr 24 22:27:56.390002 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:56.389984 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-666f4c58c-2rgjc"]
Apr 24 22:27:57.647273 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:57.647245 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" path="/var/lib/kubelet/pods/87bb2a02-dd40-457e-9705-e3818f41d37e/volumes"
Apr 24 22:27:58.370471 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:58.370406 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8/storage-initializer/0.log"
Apr 24 22:27:58.370471 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:58.370441 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" containerID="90fa20540e5d46842dce3a9bdbb9331456f9db933a5afa0795020e5895ac798f" exitCode=1
Apr 24 22:27:58.370655 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:58.370492 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" event={"ID":"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8","Type":"ContainerDied","Data":"90fa20540e5d46842dce3a9bdbb9331456f9db933a5afa0795020e5895ac798f"}
Apr 24 22:27:59.374771 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:59.374744 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8/storage-initializer/0.log"
Apr 24 22:27:59.375152 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:27:59.374807 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" event={"ID":"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8","Type":"ContainerStarted","Data":"2437d5edfe2c0e8b1b1c332b16d6fc0c1d7ac8f0e6f915400bdd0ce37d647f1d"}
Apr 24 22:28:01.381806 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:01.381778 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8/storage-initializer/1.log"
Apr 24 22:28:01.382197 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:01.382100 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8/storage-initializer/0.log"
Apr 24 22:28:01.382197 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:01.382132 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" containerID="2437d5edfe2c0e8b1b1c332b16d6fc0c1d7ac8f0e6f915400bdd0ce37d647f1d" exitCode=1
Apr 24 22:28:01.382197 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:01.382189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" event={"ID":"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8","Type":"ContainerDied","Data":"2437d5edfe2c0e8b1b1c332b16d6fc0c1d7ac8f0e6f915400bdd0ce37d647f1d"}
Apr 24 22:28:01.382346 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:01.382220 2571 scope.go:117] "RemoveContainer" containerID="90fa20540e5d46842dce3a9bdbb9331456f9db933a5afa0795020e5895ac798f"
Apr 24 22:28:01.382712 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:01.382693 2571 scope.go:117] "RemoveContainer" containerID="90fa20540e5d46842dce3a9bdbb9331456f9db933a5afa0795020e5895ac798f"
Apr 24 22:28:01.392744 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:28:01.392705 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_kserve-ci-e2e-test_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8_0 in pod sandbox bac632738afbaeec338b1c89702301d22cf8e34bd5e7ca3c19e94d5d2ece70f0 from index: no such id: '90fa20540e5d46842dce3a9bdbb9331456f9db933a5afa0795020e5895ac798f'" containerID="90fa20540e5d46842dce3a9bdbb9331456f9db933a5afa0795020e5895ac798f"
Apr 24 22:28:01.392854 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:28:01.392758 2571 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_kserve-ci-e2e-test_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8_0 in pod sandbox bac632738afbaeec338b1c89702301d22cf8e34bd5e7ca3c19e94d5d2ece70f0 from index: no such id: '90fa20540e5d46842dce3a9bdbb9331456f9db933a5afa0795020e5895ac798f'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_kserve-ci-e2e-test(dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8)\"" logger="UnhandledError"
Apr 24 22:28:01.394059 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:28:01.394041 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_kserve-ci-e2e-test(dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" podUID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8"
Apr 24 22:28:02.357664 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.357633 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"]
Apr 24 22:28:02.388500 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.388470 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8/storage-initializer/1.log"
Apr 24 22:28:02.509195 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.509173 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8/storage-initializer/1.log"
Apr 24 22:28:02.509332 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.509238 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"
Apr 24 22:28:02.517283 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.517266 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls\") pod \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") "
Apr 24 22:28:02.517387 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.517370 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") "
Apr 24 22:28:02.517428 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.517410 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6chk7\" (UniqueName: \"kubernetes.io/projected/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kube-api-access-6chk7\") pod \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") "
Apr 24 22:28:02.517464 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.517443 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kserve-provision-location\") pod \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\" (UID: \"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8\") "
Apr 24 22:28:02.517730 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.517707 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config"
(OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" (UID: "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:28:02.517981 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.517729 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" (UID: "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:28:02.519278 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.519258 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" (UID: "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:28:02.519436 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.519419 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kube-api-access-6chk7" (OuterVolumeSpecName: "kube-api-access-6chk7") pod "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" (UID: "dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8"). InnerVolumeSpecName "kube-api-access-6chk7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:28:02.618527 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.618453 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:28:02.618527 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.618478 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:28:02.618527 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.618491 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6chk7\" (UniqueName: \"kubernetes.io/projected/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kube-api-access-6chk7\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:28:02.618527 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:02.618502 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:28:03.392803 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.392769 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq_dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8/storage-initializer/1.log" Apr 24 22:28:03.393203 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.392849 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" 
event={"ID":"dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8","Type":"ContainerDied","Data":"bac632738afbaeec338b1c89702301d22cf8e34bd5e7ca3c19e94d5d2ece70f0"} Apr 24 22:28:03.393203 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.392892 2571 scope.go:117] "RemoveContainer" containerID="2437d5edfe2c0e8b1b1c332b16d6fc0c1d7ac8f0e6f915400bdd0ce37d647f1d" Apr 24 22:28:03.393203 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.392907 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq" Apr 24 22:28:03.440437 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.440399 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"] Apr 24 22:28:03.442194 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442167 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp"] Apr 24 22:28:03.442456 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442443 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="storage-initializer" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442458 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="storage-initializer" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442468 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" containerName="storage-initializer" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442477 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" containerName="storage-initializer" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 
22:28:03.442485 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" containerName="storage-initializer" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442490 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" containerName="storage-initializer" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442498 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442503 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442509 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kube-rbac-proxy" Apr 24 22:28:03.442511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442514 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kube-rbac-proxy" Apr 24 22:28:03.442776 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442563 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kserve-container" Apr 24 22:28:03.442776 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442574 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" containerName="storage-initializer" Apr 24 22:28:03.442776 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.442580 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" containerName="storage-initializer" Apr 24 22:28:03.442776 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:28:03.442586 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="87bb2a02-dd40-457e-9705-e3818f41d37e" containerName="kube-rbac-proxy" Apr 24 22:28:03.446866 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.446848 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.453793 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.453773 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:28:03.453931 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.453773 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 24 22:28:03.453931 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.453870 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 22:28:03.454058 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.453943 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 24 22:28:03.454119 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.454062 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 22:28:03.454674 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.454654 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 22:28:03.454923 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.454910 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-j8kq7\"" Apr 24 22:28:03.455058 ip-10-0-139-5 
kubenswrapper[2571]: I0424 22:28:03.455043 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-9c6dddd45-6bvqq"] Apr 24 22:28:03.460254 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.460236 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp"] Apr 24 22:28:03.524807 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.524772 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.524946 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.524828 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4l5v\" (UniqueName: \"kubernetes.io/projected/cb00da89-79ff-492a-9fff-7325fbbde32b-kube-api-access-r4l5v\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.524946 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.524920 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 
22:28:03.525073 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.524961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb00da89-79ff-492a-9fff-7325fbbde32b-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.525073 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.525018 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb00da89-79ff-492a-9fff-7325fbbde32b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.625725 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.625689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4l5v\" (UniqueName: \"kubernetes.io/projected/cb00da89-79ff-492a-9fff-7325fbbde32b-kube-api-access-r4l5v\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.625877 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.625769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.625877 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.625799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb00da89-79ff-492a-9fff-7325fbbde32b-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.625877 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.625835 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb00da89-79ff-492a-9fff-7325fbbde32b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.625877 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.625863 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.626362 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.626336 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb00da89-79ff-492a-9fff-7325fbbde32b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.626487 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.626471 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.626539 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.626489 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.628457 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.628442 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb00da89-79ff-492a-9fff-7325fbbde32b-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.634807 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.634779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4l5v\" (UniqueName: \"kubernetes.io/projected/cb00da89-79ff-492a-9fff-7325fbbde32b-kube-api-access-r4l5v\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.647953 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.647899 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8" path="/var/lib/kubelet/pods/dfa7b3af-7f5f-4e0d-bfc5-34fc4c2a10c8/volumes" Apr 24 22:28:03.757945 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.757892 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:03.876522 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:03.876489 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp"] Apr 24 22:28:03.879414 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:28:03.879385 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb00da89_79ff_492a_9fff_7325fbbde32b.slice/crio-7d676b7d92c5edaceee3db3051c25c1932e758cd2462d684ff3aa2463e22ac21 WatchSource:0}: Error finding container 7d676b7d92c5edaceee3db3051c25c1932e758cd2462d684ff3aa2463e22ac21: Status 404 returned error can't find the container with id 7d676b7d92c5edaceee3db3051c25c1932e758cd2462d684ff3aa2463e22ac21 Apr 24 22:28:04.397553 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:04.397509 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" event={"ID":"cb00da89-79ff-492a-9fff-7325fbbde32b","Type":"ContainerStarted","Data":"0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92"} Apr 24 22:28:04.397553 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:04.397548 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" 
event={"ID":"cb00da89-79ff-492a-9fff-7325fbbde32b","Type":"ContainerStarted","Data":"7d676b7d92c5edaceee3db3051c25c1932e758cd2462d684ff3aa2463e22ac21"} Apr 24 22:28:05.402587 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:05.402555 2571 generic.go:358] "Generic (PLEG): container finished" podID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerID="0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92" exitCode=0 Apr 24 22:28:05.402977 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:05.402620 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" event={"ID":"cb00da89-79ff-492a-9fff-7325fbbde32b","Type":"ContainerDied","Data":"0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92"} Apr 24 22:28:06.407532 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:06.407492 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" event={"ID":"cb00da89-79ff-492a-9fff-7325fbbde32b","Type":"ContainerStarted","Data":"397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29"} Apr 24 22:28:06.407532 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:06.407537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" event={"ID":"cb00da89-79ff-492a-9fff-7325fbbde32b","Type":"ContainerStarted","Data":"74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9"} Apr 24 22:28:06.408046 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:06.407706 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:06.430012 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:06.429959 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podStartSLOduration=3.429945967 podStartE2EDuration="3.429945967s" podCreationTimestamp="2026-04-24 22:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:28:06.428205034 +0000 UTC m=+3687.276249148" watchObservedRunningTime="2026-04-24 22:28:06.429945967 +0000 UTC m=+3687.277990080" Apr 24 22:28:07.410798 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:07.410761 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:07.411734 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:07.411706 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 22:28:08.413362 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:08.413320 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 22:28:13.418076 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:13.418043 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:28:13.418641 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:13.418613 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 22:28:23.418898 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:23.418853 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 22:28:33.419497 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:33.419460 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 22:28:43.419264 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:43.419226 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 22:28:53.419397 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:28:53.419358 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 22:29:03.418676 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:03.418638 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.134.0.62:8080: connect: connection refused" Apr 24 22:29:13.419462 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:13.419432 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:29:23.505246 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:23.505211 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp"] Apr 24 22:29:23.505678 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:23.505537 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" containerID="cri-o://74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9" gracePeriod=30 Apr 24 22:29:23.505678 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:23.505570 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kube-rbac-proxy" containerID="cri-o://397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29" gracePeriod=30 Apr 24 22:29:23.620719 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:23.620690 2571 generic.go:358] "Generic (PLEG): container finished" podID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerID="397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29" exitCode=2 Apr 24 22:29:23.620856 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:23.620727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" event={"ID":"cb00da89-79ff-492a-9fff-7325fbbde32b","Type":"ContainerDied","Data":"397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29"} Apr 24 
22:29:24.584343 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.584310 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q"] Apr 24 22:29:24.587648 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.587630 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.589934 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.589912 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 24 22:29:24.590190 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.590170 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 24 22:29:24.596556 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.596532 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q"] Apr 24 22:29:24.627106 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.627073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e1cf513f-785c-451b-86b1-34feb6d94b1a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.627248 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.627124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e1cf513f-785c-451b-86b1-34feb6d94b1a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.627248 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.627162 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cf513f-785c-451b-86b1-34feb6d94b1a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.627248 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.627225 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8lf2\" (UniqueName: \"kubernetes.io/projected/e1cf513f-785c-451b-86b1-34feb6d94b1a-kube-api-access-z8lf2\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.728468 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.728434 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e1cf513f-785c-451b-86b1-34feb6d94b1a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.728689 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.728484 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1cf513f-785c-451b-86b1-34feb6d94b1a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.728689 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.728520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cf513f-785c-451b-86b1-34feb6d94b1a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.728689 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.728546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8lf2\" (UniqueName: \"kubernetes.io/projected/e1cf513f-785c-451b-86b1-34feb6d94b1a-kube-api-access-z8lf2\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.729036 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.729015 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cf513f-785c-451b-86b1-34feb6d94b1a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.729213 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.729192 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e1cf513f-785c-451b-86b1-34feb6d94b1a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.731169 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.731154 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1cf513f-785c-451b-86b1-34feb6d94b1a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.737506 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.737481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8lf2\" (UniqueName: \"kubernetes.io/projected/e1cf513f-785c-451b-86b1-34feb6d94b1a-kube-api-access-z8lf2\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:24.898811 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:24.898719 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:25.021596 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:25.021562 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q"] Apr 24 22:29:25.024763 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:29:25.024700 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1cf513f_785c_451b_86b1_34feb6d94b1a.slice/crio-3a6e752805164d8d8bb0b8ffd2e9fc821d7d1c047672df69e74ecf773a800b22 WatchSource:0}: Error finding container 3a6e752805164d8d8bb0b8ffd2e9fc821d7d1c047672df69e74ecf773a800b22: Status 404 returned error can't find the container with id 3a6e752805164d8d8bb0b8ffd2e9fc821d7d1c047672df69e74ecf773a800b22 Apr 24 22:29:25.026964 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:25.026943 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:29:25.627250 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:25.627206 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" event={"ID":"e1cf513f-785c-451b-86b1-34feb6d94b1a","Type":"ContainerStarted","Data":"9ffc31b79fb1e41c27cb622cbcb9630e8d3939aefb970db0e292728c4efb3d67"} Apr 24 22:29:25.627250 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:25.627242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" event={"ID":"e1cf513f-785c-451b-86b1-34feb6d94b1a","Type":"ContainerStarted","Data":"3a6e752805164d8d8bb0b8ffd2e9fc821d7d1c047672df69e74ecf773a800b22"} Apr 24 22:29:27.851693 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.851671 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:29:27.951181 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.951092 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb00da89-79ff-492a-9fff-7325fbbde32b-kserve-provision-location\") pod \"cb00da89-79ff-492a-9fff-7325fbbde32b\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " Apr 24 22:29:27.951181 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.951145 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-cabundle-cert\") pod \"cb00da89-79ff-492a-9fff-7325fbbde32b\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " Apr 24 22:29:27.951181 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.951178 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"cb00da89-79ff-492a-9fff-7325fbbde32b\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " Apr 24 22:29:27.951494 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.951197 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4l5v\" (UniqueName: \"kubernetes.io/projected/cb00da89-79ff-492a-9fff-7325fbbde32b-kube-api-access-r4l5v\") pod \"cb00da89-79ff-492a-9fff-7325fbbde32b\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " Apr 24 22:29:27.951494 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.951229 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb00da89-79ff-492a-9fff-7325fbbde32b-proxy-tls\") pod 
\"cb00da89-79ff-492a-9fff-7325fbbde32b\" (UID: \"cb00da89-79ff-492a-9fff-7325fbbde32b\") " Apr 24 22:29:27.951605 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.951542 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb00da89-79ff-492a-9fff-7325fbbde32b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cb00da89-79ff-492a-9fff-7325fbbde32b" (UID: "cb00da89-79ff-492a-9fff-7325fbbde32b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:29:27.951647 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.951621 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "cb00da89-79ff-492a-9fff-7325fbbde32b" (UID: "cb00da89-79ff-492a-9fff-7325fbbde32b"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:29:27.951683 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.951639 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "cb00da89-79ff-492a-9fff-7325fbbde32b" (UID: "cb00da89-79ff-492a-9fff-7325fbbde32b"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:29:27.953511 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.953490 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb00da89-79ff-492a-9fff-7325fbbde32b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cb00da89-79ff-492a-9fff-7325fbbde32b" (UID: "cb00da89-79ff-492a-9fff-7325fbbde32b"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:29:27.953595 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:27.953513 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb00da89-79ff-492a-9fff-7325fbbde32b-kube-api-access-r4l5v" (OuterVolumeSpecName: "kube-api-access-r4l5v") pod "cb00da89-79ff-492a-9fff-7325fbbde32b" (UID: "cb00da89-79ff-492a-9fff-7325fbbde32b"). InnerVolumeSpecName "kube-api-access-r4l5v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:29:28.051779 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.051749 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-cabundle-cert\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:29:28.051779 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.051774 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb00da89-79ff-492a-9fff-7325fbbde32b-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:29:28.051957 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.051789 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4l5v\" (UniqueName: \"kubernetes.io/projected/cb00da89-79ff-492a-9fff-7325fbbde32b-kube-api-access-r4l5v\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:29:28.051957 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.051802 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb00da89-79ff-492a-9fff-7325fbbde32b-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:29:28.051957 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.051813 2571 reconciler_common.go:299] 
"Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb00da89-79ff-492a-9fff-7325fbbde32b-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:29:28.637747 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.637716 2571 generic.go:358] "Generic (PLEG): container finished" podID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerID="74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9" exitCode=0 Apr 24 22:29:28.637747 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.637751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" event={"ID":"cb00da89-79ff-492a-9fff-7325fbbde32b","Type":"ContainerDied","Data":"74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9"} Apr 24 22:29:28.637998 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.637774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" event={"ID":"cb00da89-79ff-492a-9fff-7325fbbde32b","Type":"ContainerDied","Data":"7d676b7d92c5edaceee3db3051c25c1932e758cd2462d684ff3aa2463e22ac21"} Apr 24 22:29:28.637998 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.637788 2571 scope.go:117] "RemoveContainer" containerID="397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29" Apr 24 22:29:28.637998 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.637799 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp" Apr 24 22:29:28.645634 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.645618 2571 scope.go:117] "RemoveContainer" containerID="74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9" Apr 24 22:29:28.652181 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.652167 2571 scope.go:117] "RemoveContainer" containerID="0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92" Apr 24 22:29:28.658764 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.658743 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp"] Apr 24 22:29:28.659408 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.659389 2571 scope.go:117] "RemoveContainer" containerID="397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29" Apr 24 22:29:28.659792 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:29:28.659762 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29\": container with ID starting with 397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29 not found: ID does not exist" containerID="397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29" Apr 24 22:29:28.659938 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.659803 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29"} err="failed to get container status \"397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29\": rpc error: code = NotFound desc = could not find container \"397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29\": container with ID starting with 397a1fc5ea701375fd51226cb4ea1609d7dde4d59e048f72b527b745cea9fa29 not found: ID does 
not exist" Apr 24 22:29:28.659938 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.659829 2571 scope.go:117] "RemoveContainer" containerID="74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9" Apr 24 22:29:28.660237 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:29:28.660212 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9\": container with ID starting with 74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9 not found: ID does not exist" containerID="74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9" Apr 24 22:29:28.660340 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.660246 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9"} err="failed to get container status \"74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9\": rpc error: code = NotFound desc = could not find container \"74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9\": container with ID starting with 74e67e2ebf1e9ccec09c865bd3d4eacc6bd5324a35c2dd613b42cea1bcfda3a9 not found: ID does not exist" Apr 24 22:29:28.660340 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.660267 2571 scope.go:117] "RemoveContainer" containerID="0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92" Apr 24 22:29:28.660568 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:29:28.660553 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92\": container with ID starting with 0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92 not found: ID does not exist" containerID="0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92" Apr 24 22:29:28.660634 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.660571 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92"} err="failed to get container status \"0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92\": rpc error: code = NotFound desc = could not find container \"0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92\": container with ID starting with 0010751a59f5fe00247027f1c132e057966da4798b50db64123c4ce1264eef92 not found: ID does not exist" Apr 24 22:29:28.661517 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:28.661501 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7bd5b8d64c-wcjcp"] Apr 24 22:29:29.647250 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:29.647218 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" path="/var/lib/kubelet/pods/cb00da89-79ff-492a-9fff-7325fbbde32b/volumes" Apr 24 22:29:30.645388 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:30.645314 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q_e1cf513f-785c-451b-86b1-34feb6d94b1a/storage-initializer/0.log" Apr 24 22:29:30.645388 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:30.645353 2571 generic.go:358] "Generic (PLEG): container finished" podID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerID="9ffc31b79fb1e41c27cb622cbcb9630e8d3939aefb970db0e292728c4efb3d67" exitCode=1 Apr 24 22:29:30.645388 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:30.645380 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" 
event={"ID":"e1cf513f-785c-451b-86b1-34feb6d94b1a","Type":"ContainerDied","Data":"9ffc31b79fb1e41c27cb622cbcb9630e8d3939aefb970db0e292728c4efb3d67"} Apr 24 22:29:31.650386 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:31.650358 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q_e1cf513f-785c-451b-86b1-34feb6d94b1a/storage-initializer/0.log" Apr 24 22:29:31.650767 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:31.650420 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" event={"ID":"e1cf513f-785c-451b-86b1-34feb6d94b1a","Type":"ContainerStarted","Data":"6d6cfddcc99eb37d3b48621a1b37bb953db789d5a34cfb0a677cc55403aeb524"} Apr 24 22:29:34.561209 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:34.561176 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q"] Apr 24 22:29:34.561593 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:34.561472 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" podUID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerName="storage-initializer" containerID="cri-o://6d6cfddcc99eb37d3b48621a1b37bb953db789d5a34cfb0a677cc55403aeb524" gracePeriod=30 Apr 24 22:29:35.637146 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637112 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"] Apr 24 22:29:35.637615 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637595 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kube-rbac-proxy" Apr 24 22:29:35.637659 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637620 2571 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kube-rbac-proxy" Apr 24 22:29:35.637659 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637633 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" Apr 24 22:29:35.637659 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637642 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" Apr 24 22:29:35.637659 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637656 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="storage-initializer" Apr 24 22:29:35.637776 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637665 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="storage-initializer" Apr 24 22:29:35.637776 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637745 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kserve-container" Apr 24 22:29:35.637776 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.637759 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb00da89-79ff-492a-9fff-7325fbbde32b" containerName="kube-rbac-proxy" Apr 24 22:29:35.641101 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.641069 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" Apr 24 22:29:35.643653 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.643613 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 24 22:29:35.643653 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.643626 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 24 22:29:35.643653 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.643640 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 22:29:35.653224 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.653195 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"] Apr 24 22:29:35.663160 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.663140 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q_e1cf513f-785c-451b-86b1-34feb6d94b1a/storage-initializer/1.log" Apr 24 22:29:35.663554 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.663525 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q_e1cf513f-785c-451b-86b1-34feb6d94b1a/storage-initializer/0.log" Apr 24 22:29:35.663814 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.663564 2571 generic.go:358] "Generic (PLEG): container finished" podID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerID="6d6cfddcc99eb37d3b48621a1b37bb953db789d5a34cfb0a677cc55403aeb524" exitCode=1 Apr 24 22:29:35.663814 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.663616 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" event={"ID":"e1cf513f-785c-451b-86b1-34feb6d94b1a","Type":"ContainerDied","Data":"6d6cfddcc99eb37d3b48621a1b37bb953db789d5a34cfb0a677cc55403aeb524"} Apr 24 22:29:35.663814 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.663648 2571 scope.go:117] "RemoveContainer" containerID="9ffc31b79fb1e41c27cb622cbcb9630e8d3939aefb970db0e292728c4efb3d67" Apr 24 22:29:35.710224 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.710207 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q_e1cf513f-785c-451b-86b1-34feb6d94b1a/storage-initializer/1.log" Apr 24 22:29:35.710348 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.710268 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" Apr 24 22:29:35.710415 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.710396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" Apr 24 22:29:35.710464 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.710431 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3d0859-620d-4704-bbab-cc63fceda526-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" Apr 24 22:29:35.710464 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.710458 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" Apr 24 22:29:35.710561 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.710515 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfhcz\" (UniqueName: \"kubernetes.io/projected/fc3d0859-620d-4704-bbab-cc63fceda526-kube-api-access-mfhcz\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" Apr 24 22:29:35.710595 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.710564 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3d0859-620d-4704-bbab-cc63fceda526-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" Apr 24 22:29:35.811416 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811385 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8lf2\" (UniqueName: \"kubernetes.io/projected/e1cf513f-785c-451b-86b1-34feb6d94b1a-kube-api-access-z8lf2\") pod \"e1cf513f-785c-451b-86b1-34feb6d94b1a\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") " Apr 24 
22:29:35.811563 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811471 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cf513f-785c-451b-86b1-34feb6d94b1a-kserve-provision-location\") pod \"e1cf513f-785c-451b-86b1-34feb6d94b1a\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") "
Apr 24 22:29:35.811563 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811549 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1cf513f-785c-451b-86b1-34feb6d94b1a-proxy-tls\") pod \"e1cf513f-785c-451b-86b1-34feb6d94b1a\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") "
Apr 24 22:29:35.811651 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811593 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e1cf513f-785c-451b-86b1-34feb6d94b1a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"e1cf513f-785c-451b-86b1-34feb6d94b1a\" (UID: \"e1cf513f-785c-451b-86b1-34feb6d94b1a\") "
Apr 24 22:29:35.811714 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811696 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1cf513f-785c-451b-86b1-34feb6d94b1a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e1cf513f-785c-451b-86b1-34feb6d94b1a" (UID: "e1cf513f-785c-451b-86b1-34feb6d94b1a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:29:35.811776 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.811776 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811758 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3d0859-620d-4704-bbab-cc63fceda526-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.811918 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.811974 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.811950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfhcz\" (UniqueName: \"kubernetes.io/projected/fc3d0859-620d-4704-bbab-cc63fceda526-kube-api-access-mfhcz\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.812024 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.812009 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3d0859-620d-4704-bbab-cc63fceda526-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.812136 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.812088 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cf513f-785c-451b-86b1-34feb6d94b1a-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:29:35.812136 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.812006 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cf513f-785c-451b-86b1-34feb6d94b1a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "e1cf513f-785c-451b-86b1-34feb6d94b1a" (UID: "e1cf513f-785c-451b-86b1-34feb6d94b1a"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:29:35.812520 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.812491 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.812673 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.812651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.812873 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.812849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3d0859-620d-4704-bbab-cc63fceda526-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.813709 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.813681 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1cf513f-785c-451b-86b1-34feb6d94b1a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e1cf513f-785c-451b-86b1-34feb6d94b1a" (UID: "e1cf513f-785c-451b-86b1-34feb6d94b1a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:29:35.813824 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.813751 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1cf513f-785c-451b-86b1-34feb6d94b1a-kube-api-access-z8lf2" (OuterVolumeSpecName: "kube-api-access-z8lf2") pod "e1cf513f-785c-451b-86b1-34feb6d94b1a" (UID: "e1cf513f-785c-451b-86b1-34feb6d94b1a"). InnerVolumeSpecName "kube-api-access-z8lf2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:29:35.814372 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.814331 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3d0859-620d-4704-bbab-cc63fceda526-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.824280 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.824236 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfhcz\" (UniqueName: \"kubernetes.io/projected/fc3d0859-620d-4704-bbab-cc63fceda526-kube-api-access-mfhcz\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:35.913037 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.912981 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1cf513f-785c-451b-86b1-34feb6d94b1a-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:29:35.913037 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.913023 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e1cf513f-785c-451b-86b1-34feb6d94b1a-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:29:35.913037 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.913040 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8lf2\" (UniqueName: \"kubernetes.io/projected/e1cf513f-785c-451b-86b1-34feb6d94b1a-kube-api-access-z8lf2\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:29:35.953512 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:35.953485 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:36.070455 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.070431 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"]
Apr 24 22:29:36.073094 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:29:36.073068 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc3d0859_620d_4704_bbab_cc63fceda526.slice/crio-3bf10cc3a7a3307830dbd3fd9d87b7a33af25696cd93995a83091af2bec15bd0 WatchSource:0}: Error finding container 3bf10cc3a7a3307830dbd3fd9d87b7a33af25696cd93995a83091af2bec15bd0: Status 404 returned error can't find the container with id 3bf10cc3a7a3307830dbd3fd9d87b7a33af25696cd93995a83091af2bec15bd0
Apr 24 22:29:36.667470 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.667444 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q_e1cf513f-785c-451b-86b1-34feb6d94b1a/storage-initializer/1.log"
Apr 24 22:29:36.667902 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.667538 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q" event={"ID":"e1cf513f-785c-451b-86b1-34feb6d94b1a","Type":"ContainerDied","Data":"3a6e752805164d8d8bb0b8ffd2e9fc821d7d1c047672df69e74ecf773a800b22"}
Apr 24 22:29:36.667902 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.667573 2571 scope.go:117] "RemoveContainer" containerID="6d6cfddcc99eb37d3b48621a1b37bb953db789d5a34cfb0a677cc55403aeb524"
Apr 24 22:29:36.667902 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.667619 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q"
Apr 24 22:29:36.669047 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.669020 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" event={"ID":"fc3d0859-620d-4704-bbab-cc63fceda526","Type":"ContainerStarted","Data":"f785af3b59a5a031af858f43cf60f9a3866a63dd775ebe0a51637210ba2dc15f"}
Apr 24 22:29:36.669126 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.669056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" event={"ID":"fc3d0859-620d-4704-bbab-cc63fceda526","Type":"ContainerStarted","Data":"3bf10cc3a7a3307830dbd3fd9d87b7a33af25696cd93995a83091af2bec15bd0"}
Apr 24 22:29:36.719823 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.719789 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q"]
Apr 24 22:29:36.723795 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:36.723767 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58f5875f45-pnb8q"]
Apr 24 22:29:37.647860 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:37.647827 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1cf513f-785c-451b-86b1-34feb6d94b1a" path="/var/lib/kubelet/pods/e1cf513f-785c-451b-86b1-34feb6d94b1a/volumes"
Apr 24 22:29:37.672967 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:37.672939 2571 generic.go:358] "Generic (PLEG): container finished" podID="fc3d0859-620d-4704-bbab-cc63fceda526" containerID="f785af3b59a5a031af858f43cf60f9a3866a63dd775ebe0a51637210ba2dc15f" exitCode=0
Apr 24 22:29:37.673279 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:37.673025 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" event={"ID":"fc3d0859-620d-4704-bbab-cc63fceda526","Type":"ContainerDied","Data":"f785af3b59a5a031af858f43cf60f9a3866a63dd775ebe0a51637210ba2dc15f"}
Apr 24 22:29:38.676725 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:38.676692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" event={"ID":"fc3d0859-620d-4704-bbab-cc63fceda526","Type":"ContainerStarted","Data":"71f94b7dc684acd6047da5a4db5ee9ae16fd20427c66cb055031723fea221632"}
Apr 24 22:29:38.676725 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:38.676722 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" event={"ID":"fc3d0859-620d-4704-bbab-cc63fceda526","Type":"ContainerStarted","Data":"8b68caa0c075cb2103fc7800f6204fc481b9c55a75ce8c1bed3e6a043b74819f"}
Apr 24 22:29:38.677248 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:38.676849 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:38.697581 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:38.697545 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podStartSLOduration=3.69753293 podStartE2EDuration="3.69753293s" podCreationTimestamp="2026-04-24 22:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:38.696360959 +0000 UTC m=+3779.544405073" watchObservedRunningTime="2026-04-24 22:29:38.69753293 +0000 UTC m=+3779.545577044"
Apr 24 22:29:39.679804 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:39.679774 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:39.680902 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:39.680877 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 24 22:29:40.682806 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:40.682764 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 24 22:29:45.686810 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:45.686779 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:29:45.687456 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:45.687427 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 24 22:29:55.687498 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:29:55.687460 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 24 22:30:05.687393 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:05.687354 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 24 22:30:15.687883 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:15.687847 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 24 22:30:25.687722 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:25.687685 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 24 22:30:35.687712 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:35.687678 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 24 22:30:45.688454 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:45.688427 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:30:55.762121 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:55.762084 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"]
Apr 24 22:30:55.762528 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:55.762436 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" containerID="cri-o://8b68caa0c075cb2103fc7800f6204fc481b9c55a75ce8c1bed3e6a043b74819f" gracePeriod=30
Apr 24 22:30:55.762528 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:55.762476 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kube-rbac-proxy" containerID="cri-o://71f94b7dc684acd6047da5a4db5ee9ae16fd20427c66cb055031723fea221632" gracePeriod=30
Apr 24 22:30:55.885847 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:55.885815 2571 generic.go:358] "Generic (PLEG): container finished" podID="fc3d0859-620d-4704-bbab-cc63fceda526" containerID="71f94b7dc684acd6047da5a4db5ee9ae16fd20427c66cb055031723fea221632" exitCode=2
Apr 24 22:30:55.885978 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:55.885884 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" event={"ID":"fc3d0859-620d-4704-bbab-cc63fceda526","Type":"ContainerDied","Data":"71f94b7dc684acd6047da5a4db5ee9ae16fd20427c66cb055031723fea221632"}
Apr 24 22:30:56.762177 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.762142 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"]
Apr 24 22:30:56.762598 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.762478 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerName="storage-initializer"
Apr 24 22:30:56.762598 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.762489 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerName="storage-initializer"
Apr 24 22:30:56.762598 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.762501 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerName="storage-initializer"
Apr 24 22:30:56.762598 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.762507 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerName="storage-initializer"
Apr 24 22:30:56.762598 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.762552 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerName="storage-initializer"
Apr 24 22:30:56.762598 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.762561 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1cf513f-785c-451b-86b1-34feb6d94b1a" containerName="storage-initializer"
Apr 24 22:30:56.765606 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.765587 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.768093 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.768066 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\""
Apr 24 22:30:56.768194 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.768098 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\""
Apr 24 22:30:56.775893 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.775873 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"]
Apr 24 22:30:56.817764 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.817741 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c2daf79-da69-4ce0-b95c-85356e6b3877-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.817875 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.817775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c2daf79-da69-4ce0-b95c-85356e6b3877-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.817875 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.817798 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffgn9\" (UniqueName: \"kubernetes.io/projected/2c2daf79-da69-4ce0-b95c-85356e6b3877-kube-api-access-ffgn9\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.817953 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.817888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.918214 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.918179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c2daf79-da69-4ce0-b95c-85356e6b3877-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.918419 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.918221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c2daf79-da69-4ce0-b95c-85356e6b3877-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.918419 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.918246 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffgn9\" (UniqueName: \"kubernetes.io/projected/2c2daf79-da69-4ce0-b95c-85356e6b3877-kube-api-access-ffgn9\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.918419 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.918288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.918565 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:30:56.918426 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 24 22:30:56.918565 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:30:56.918474 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls podName:2c2daf79-da69-4ce0-b95c-85356e6b3877 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:57.418458152 +0000 UTC m=+3858.266502245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" (UID: "2c2daf79-da69-4ce0-b95c-85356e6b3877") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 24 22:30:56.918701 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.918679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c2daf79-da69-4ce0-b95c-85356e6b3877-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.918921 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.918903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c2daf79-da69-4ce0-b95c-85356e6b3877-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:56.927157 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:56.927134 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffgn9\" (UniqueName: \"kubernetes.io/projected/2c2daf79-da69-4ce0-b95c-85356e6b3877-kube-api-access-ffgn9\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:57.421657 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:57.421619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:57.424194 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:57.424172 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:57.675517 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:57.675438 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"
Apr 24 22:30:57.794837 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:57.794667 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"]
Apr 24 22:30:57.797313 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:30:57.797273 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2daf79_da69_4ce0_b95c_85356e6b3877.slice/crio-f370bc48f35e6e4a11026a30afced6f824d666c10971d9af5778b0afa2298d99 WatchSource:0}: Error finding container f370bc48f35e6e4a11026a30afced6f824d666c10971d9af5778b0afa2298d99: Status 404 returned error can't find the container with id f370bc48f35e6e4a11026a30afced6f824d666c10971d9af5778b0afa2298d99
Apr 24 22:30:57.892406 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:57.892374 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" event={"ID":"2c2daf79-da69-4ce0-b95c-85356e6b3877","Type":"ContainerStarted","Data":"fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c"}
Apr 24 22:30:57.892515 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:57.892415 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" event={"ID":"2c2daf79-da69-4ce0-b95c-85356e6b3877","Type":"ContainerStarted","Data":"f370bc48f35e6e4a11026a30afced6f824d666c10971d9af5778b0afa2298d99"}
Apr 24 22:30:59.903168 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.903137 2571 generic.go:358] "Generic (PLEG): container finished" podID="fc3d0859-620d-4704-bbab-cc63fceda526" containerID="8b68caa0c075cb2103fc7800f6204fc481b9c55a75ce8c1bed3e6a043b74819f" exitCode=0
Apr 24 22:30:59.903540 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.903208 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" event={"ID":"fc3d0859-620d-4704-bbab-cc63fceda526","Type":"ContainerDied","Data":"8b68caa0c075cb2103fc7800f6204fc481b9c55a75ce8c1bed3e6a043b74819f"}
Apr 24 22:30:59.918835 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.918817 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:30:59.940730 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.940598 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"fc3d0859-620d-4704-bbab-cc63fceda526\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") "
Apr 24 22:30:59.940730 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.940659 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3d0859-620d-4704-bbab-cc63fceda526-kserve-provision-location\") pod \"fc3d0859-620d-4704-bbab-cc63fceda526\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") "
Apr 24 22:30:59.940730 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.940698 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3d0859-620d-4704-bbab-cc63fceda526-proxy-tls\") pod \"fc3d0859-620d-4704-bbab-cc63fceda526\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") "
Apr 24 22:30:59.940730 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.940743 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfhcz\" (UniqueName: \"kubernetes.io/projected/fc3d0859-620d-4704-bbab-cc63fceda526-kube-api-access-mfhcz\") pod \"fc3d0859-620d-4704-bbab-cc63fceda526\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") "
Apr 24 22:30:59.941053 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.940815 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-cabundle-cert\") pod \"fc3d0859-620d-4704-bbab-cc63fceda526\" (UID: \"fc3d0859-620d-4704-bbab-cc63fceda526\") "
Apr 24 22:30:59.941110 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.941046 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3d0859-620d-4704-bbab-cc63fceda526-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fc3d0859-620d-4704-bbab-cc63fceda526" (UID: "fc3d0859-620d-4704-bbab-cc63fceda526"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:30:59.941394 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.941360 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "fc3d0859-620d-4704-bbab-cc63fceda526" (UID: "fc3d0859-620d-4704-bbab-cc63fceda526"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:30:59.941904 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.941872 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "fc3d0859-620d-4704-bbab-cc63fceda526" (UID: "fc3d0859-620d-4704-bbab-cc63fceda526"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:30:59.943627 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.943600 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3d0859-620d-4704-bbab-cc63fceda526-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fc3d0859-620d-4704-bbab-cc63fceda526" (UID: "fc3d0859-620d-4704-bbab-cc63fceda526"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:30:59.943627 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:30:59.943609 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3d0859-620d-4704-bbab-cc63fceda526-kube-api-access-mfhcz" (OuterVolumeSpecName: "kube-api-access-mfhcz") pod "fc3d0859-620d-4704-bbab-cc63fceda526" (UID: "fc3d0859-620d-4704-bbab-cc63fceda526"). InnerVolumeSpecName "kube-api-access-mfhcz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:31:00.041385 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.041347 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-cabundle-cert\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:31:00.041385 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.041384 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc3d0859-620d-4704-bbab-cc63fceda526-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:31:00.041600 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.041402 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc3d0859-620d-4704-bbab-cc63fceda526-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:31:00.041600 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.041413 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc3d0859-620d-4704-bbab-cc63fceda526-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:31:00.041600 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.041422 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfhcz\" (UniqueName: \"kubernetes.io/projected/fc3d0859-620d-4704-bbab-cc63fceda526-kube-api-access-mfhcz\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\""
Apr 24 22:31:00.907793 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.907758 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px" event={"ID":"fc3d0859-620d-4704-bbab-cc63fceda526","Type":"ContainerDied","Data":"3bf10cc3a7a3307830dbd3fd9d87b7a33af25696cd93995a83091af2bec15bd0"}
Apr 24 22:31:00.907793 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.907801 2571 scope.go:117] "RemoveContainer" containerID="71f94b7dc684acd6047da5a4db5ee9ae16fd20427c66cb055031723fea221632"
Apr 24 22:31:00.908343 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.907812 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"
Apr 24 22:31:00.920762 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.920477 2571 scope.go:117] "RemoveContainer" containerID="8b68caa0c075cb2103fc7800f6204fc481b9c55a75ce8c1bed3e6a043b74819f"
Apr 24 22:31:00.928008 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.927988 2571 scope.go:117] "RemoveContainer" containerID="f785af3b59a5a031af858f43cf60f9a3866a63dd775ebe0a51637210ba2dc15f"
Apr 24 22:31:00.945396 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.945374 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"]
Apr 24 22:31:00.947247 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:00.947219 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-654d8c6f8b-677px"]
Apr 24 22:31:01.648049 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:01.648013 2571 kubelet_volumes.go:163]
"Cleaned up orphaned pod volumes dir" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" path="/var/lib/kubelet/pods/fc3d0859-620d-4704-bbab-cc63fceda526/volumes" Apr 24 22:31:04.920508 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:04.920478 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2_2c2daf79-da69-4ce0-b95c-85356e6b3877/storage-initializer/0.log" Apr 24 22:31:04.920893 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:04.920517 2571 generic.go:358] "Generic (PLEG): container finished" podID="2c2daf79-da69-4ce0-b95c-85356e6b3877" containerID="fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c" exitCode=1 Apr 24 22:31:04.920893 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:04.920543 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" event={"ID":"2c2daf79-da69-4ce0-b95c-85356e6b3877","Type":"ContainerDied","Data":"fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c"} Apr 24 22:31:05.924668 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:05.924640 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2_2c2daf79-da69-4ce0-b95c-85356e6b3877/storage-initializer/0.log" Apr 24 22:31:05.925042 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:05.924735 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" event={"ID":"2c2daf79-da69-4ce0-b95c-85356e6b3877","Type":"ContainerStarted","Data":"bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8"} Apr 24 22:31:06.745920 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:06.745885 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"] Apr 24 22:31:06.927663 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:06.927621 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" podUID="2c2daf79-da69-4ce0-b95c-85356e6b3877" containerName="storage-initializer" containerID="cri-o://bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8" gracePeriod=30 Apr 24 22:31:11.264758 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.264737 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2_2c2daf79-da69-4ce0-b95c-85356e6b3877/storage-initializer/1.log" Apr 24 22:31:11.265147 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.265131 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2_2c2daf79-da69-4ce0-b95c-85356e6b3877/storage-initializer/0.log" Apr 24 22:31:11.265206 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.265196 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" Apr 24 22:31:11.333056 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.332973 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffgn9\" (UniqueName: \"kubernetes.io/projected/2c2daf79-da69-4ce0-b95c-85356e6b3877-kube-api-access-ffgn9\") pod \"2c2daf79-da69-4ce0-b95c-85356e6b3877\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " Apr 24 22:31:11.333056 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.333017 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c2daf79-da69-4ce0-b95c-85356e6b3877-kserve-provision-location\") pod \"2c2daf79-da69-4ce0-b95c-85356e6b3877\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " Apr 24 22:31:11.333325 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.333086 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls\") pod \"2c2daf79-da69-4ce0-b95c-85356e6b3877\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " Apr 24 22:31:11.333325 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.333117 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c2daf79-da69-4ce0-b95c-85356e6b3877-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"2c2daf79-da69-4ce0-b95c-85356e6b3877\" (UID: \"2c2daf79-da69-4ce0-b95c-85356e6b3877\") " Apr 24 22:31:11.333437 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.333382 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2daf79-da69-4ce0-b95c-85356e6b3877-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "2c2daf79-da69-4ce0-b95c-85356e6b3877" (UID: "2c2daf79-da69-4ce0-b95c-85356e6b3877"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:31:11.333534 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.333513 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2daf79-da69-4ce0-b95c-85356e6b3877-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "2c2daf79-da69-4ce0-b95c-85356e6b3877" (UID: "2c2daf79-da69-4ce0-b95c-85356e6b3877"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:11.335250 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.335232 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2c2daf79-da69-4ce0-b95c-85356e6b3877" (UID: "2c2daf79-da69-4ce0-b95c-85356e6b3877"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:11.335326 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.335287 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2daf79-da69-4ce0-b95c-85356e6b3877-kube-api-access-ffgn9" (OuterVolumeSpecName: "kube-api-access-ffgn9") pod "2c2daf79-da69-4ce0-b95c-85356e6b3877" (UID: "2c2daf79-da69-4ce0-b95c-85356e6b3877"). InnerVolumeSpecName "kube-api-access-ffgn9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:11.433774 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.433739 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c2daf79-da69-4ce0-b95c-85356e6b3877-proxy-tls\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:31:11.433774 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.433770 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2c2daf79-da69-4ce0-b95c-85356e6b3877-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:31:11.433774 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.433782 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ffgn9\" (UniqueName: \"kubernetes.io/projected/2c2daf79-da69-4ce0-b95c-85356e6b3877-kube-api-access-ffgn9\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:31:11.434068 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.433791 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c2daf79-da69-4ce0-b95c-85356e6b3877-kserve-provision-location\") on node \"ip-10-0-139-5.ec2.internal\" DevicePath \"\"" Apr 24 22:31:11.941580 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.941492 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2_2c2daf79-da69-4ce0-b95c-85356e6b3877/storage-initializer/1.log" Apr 24 22:31:11.941844 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.941827 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2_2c2daf79-da69-4ce0-b95c-85356e6b3877/storage-initializer/0.log" Apr 24 22:31:11.941899 
ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.941865 2571 generic.go:358] "Generic (PLEG): container finished" podID="2c2daf79-da69-4ce0-b95c-85356e6b3877" containerID="bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8" exitCode=1 Apr 24 22:31:11.941899 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.941893 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" event={"ID":"2c2daf79-da69-4ce0-b95c-85356e6b3877","Type":"ContainerDied","Data":"bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8"} Apr 24 22:31:11.941983 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.941915 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" event={"ID":"2c2daf79-da69-4ce0-b95c-85356e6b3877","Type":"ContainerDied","Data":"f370bc48f35e6e4a11026a30afced6f824d666c10971d9af5778b0afa2298d99"} Apr 24 22:31:11.941983 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.941930 2571 scope.go:117] "RemoveContainer" containerID="bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8" Apr 24 22:31:11.941983 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.941942 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2" Apr 24 22:31:11.949478 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.949456 2571 scope.go:117] "RemoveContainer" containerID="fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c" Apr 24 22:31:11.955867 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.955850 2571 scope.go:117] "RemoveContainer" containerID="bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8" Apr 24 22:31:11.956097 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:31:11.956079 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8\": container with ID starting with bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8 not found: ID does not exist" containerID="bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8" Apr 24 22:31:11.956170 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.956110 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8"} err="failed to get container status \"bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8\": rpc error: code = NotFound desc = could not find container \"bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8\": container with ID starting with bc80f61f788080f452c1ecbb144f1b3ddb36f801f2736b069027bb988efcd2d8 not found: ID does not exist" Apr 24 22:31:11.956170 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.956135 2571 scope.go:117] "RemoveContainer" containerID="fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c" Apr 24 22:31:11.956407 ip-10-0-139-5 kubenswrapper[2571]: E0424 22:31:11.956386 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c\": container with ID starting with fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c not found: ID does not exist" containerID="fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c" Apr 24 22:31:11.956453 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.956413 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c"} err="failed to get container status \"fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c\": rpc error: code = NotFound desc = could not find container \"fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c\": container with ID starting with fff0c8bba9a43906558d6ea85e6820912728a098e938ab310a25d0e26979075c not found: ID does not exist" Apr 24 22:31:11.977027 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.977004 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"] Apr 24 22:31:11.980724 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:11.980703 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-86b6454c68-f66l2"] Apr 24 22:31:13.647500 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:13.647467 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2daf79-da69-4ce0-b95c-85356e6b3877" path="/var/lib/kubelet/pods/2c2daf79-da69-4ce0-b95c-85356e6b3877/volumes" Apr 24 22:31:35.498960 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.498920 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j4vvx/must-gather-zb6wj"] Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499178 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c2daf79-da69-4ce0-b95c-85356e6b3877" 
containerName="storage-initializer" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499188 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2daf79-da69-4ce0-b95c-85356e6b3877" containerName="storage-initializer" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499196 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c2daf79-da69-4ce0-b95c-85356e6b3877" containerName="storage-initializer" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499201 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2daf79-da69-4ce0-b95c-85356e6b3877" containerName="storage-initializer" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499215 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499220 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499228 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kube-rbac-proxy" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499233 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kube-rbac-proxy" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499240 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="storage-initializer" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499245 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="storage-initializer" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499289 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c2daf79-da69-4ce0-b95c-85356e6b3877" containerName="storage-initializer" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499312 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kserve-container" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499319 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c2daf79-da69-4ce0-b95c-85356e6b3877" containerName="storage-initializer" Apr 24 22:31:35.499382 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.499325 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc3d0859-620d-4704-bbab-cc63fceda526" containerName="kube-rbac-proxy" Apr 24 22:31:35.502102 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.502087 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j4vvx/must-gather-zb6wj" Apr 24 22:31:35.505020 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.505000 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4vvx\"/\"kube-root-ca.crt\"" Apr 24 22:31:35.505020 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.505014 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4vvx\"/\"openshift-service-ca.crt\"" Apr 24 22:31:35.506268 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.506253 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-j4vvx\"/\"default-dockercfg-c2k9q\"" Apr 24 22:31:35.511551 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.511529 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4vvx/must-gather-zb6wj"] Apr 24 22:31:35.625154 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.625118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17cb2b83-3930-4129-8e8c-a9cf1348c857-must-gather-output\") pod \"must-gather-zb6wj\" (UID: \"17cb2b83-3930-4129-8e8c-a9cf1348c857\") " pod="openshift-must-gather-j4vvx/must-gather-zb6wj" Apr 24 22:31:35.625363 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.625165 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqqf\" (UniqueName: \"kubernetes.io/projected/17cb2b83-3930-4129-8e8c-a9cf1348c857-kube-api-access-xlqqf\") pod \"must-gather-zb6wj\" (UID: \"17cb2b83-3930-4129-8e8c-a9cf1348c857\") " pod="openshift-must-gather-j4vvx/must-gather-zb6wj" Apr 24 22:31:35.725767 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.725728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/17cb2b83-3930-4129-8e8c-a9cf1348c857-must-gather-output\") pod \"must-gather-zb6wj\" (UID: \"17cb2b83-3930-4129-8e8c-a9cf1348c857\") " pod="openshift-must-gather-j4vvx/must-gather-zb6wj" Apr 24 22:31:35.725969 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.725782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqqf\" (UniqueName: \"kubernetes.io/projected/17cb2b83-3930-4129-8e8c-a9cf1348c857-kube-api-access-xlqqf\") pod \"must-gather-zb6wj\" (UID: \"17cb2b83-3930-4129-8e8c-a9cf1348c857\") " pod="openshift-must-gather-j4vvx/must-gather-zb6wj" Apr 24 22:31:35.726127 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.726107 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17cb2b83-3930-4129-8e8c-a9cf1348c857-must-gather-output\") pod \"must-gather-zb6wj\" (UID: \"17cb2b83-3930-4129-8e8c-a9cf1348c857\") " pod="openshift-must-gather-j4vvx/must-gather-zb6wj" Apr 24 22:31:35.735044 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.735021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqqf\" (UniqueName: \"kubernetes.io/projected/17cb2b83-3930-4129-8e8c-a9cf1348c857-kube-api-access-xlqqf\") pod \"must-gather-zb6wj\" (UID: \"17cb2b83-3930-4129-8e8c-a9cf1348c857\") " pod="openshift-must-gather-j4vvx/must-gather-zb6wj" Apr 24 22:31:35.811282 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.811196 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j4vvx/must-gather-zb6wj" Apr 24 22:31:35.933283 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:35.933247 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4vvx/must-gather-zb6wj"] Apr 24 22:31:35.936572 ip-10-0-139-5 kubenswrapper[2571]: W0424 22:31:35.936540 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17cb2b83_3930_4129_8e8c_a9cf1348c857.slice/crio-16d3f49ae98ce43c9b7ef35e3d400b5611d3f79543fd70e211575d10dee844c7 WatchSource:0}: Error finding container 16d3f49ae98ce43c9b7ef35e3d400b5611d3f79543fd70e211575d10dee844c7: Status 404 returned error can't find the container with id 16d3f49ae98ce43c9b7ef35e3d400b5611d3f79543fd70e211575d10dee844c7 Apr 24 22:31:36.008547 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:36.008514 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/must-gather-zb6wj" event={"ID":"17cb2b83-3930-4129-8e8c-a9cf1348c857","Type":"ContainerStarted","Data":"16d3f49ae98ce43c9b7ef35e3d400b5611d3f79543fd70e211575d10dee844c7"} Apr 24 22:31:37.014063 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:37.013323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/must-gather-zb6wj" event={"ID":"17cb2b83-3930-4129-8e8c-a9cf1348c857","Type":"ContainerStarted","Data":"2efa9641e7cb246523623a0dbaed86ee86902bff885bc6cddb5f5db63a86f982"} Apr 24 22:31:37.014063 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:37.013367 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/must-gather-zb6wj" event={"ID":"17cb2b83-3930-4129-8e8c-a9cf1348c857","Type":"ContainerStarted","Data":"1f8b3ab4bae6976465cb290bc6fc3f69c2e2d3707aeec588d58ce244ff62ebe6"} Apr 24 22:31:37.029810 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:37.029721 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-j4vvx/must-gather-zb6wj" podStartSLOduration=1.2896967830000001 podStartE2EDuration="2.029700592s" podCreationTimestamp="2026-04-24 22:31:35 +0000 UTC" firstStartedPulling="2026-04-24 22:31:35.938222044 +0000 UTC m=+3896.786266139" lastFinishedPulling="2026-04-24 22:31:36.678225837 +0000 UTC m=+3897.526269948" observedRunningTime="2026-04-24 22:31:37.028793472 +0000 UTC m=+3897.876837569" watchObservedRunningTime="2026-04-24 22:31:37.029700592 +0000 UTC m=+3897.877744708" Apr 24 22:31:38.230542 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:38.230514 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7dkks_2405d410-d9d7-4e59-96b9-c1697c0b1258/global-pull-secret-syncer/0.log" Apr 24 22:31:38.552251 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:38.552216 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xnxmv_205a6f27-fa4b-46d4-bf0b-d91aa8cf134c/konnectivity-agent/0.log" Apr 24 22:31:38.636222 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:38.636181 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-5.ec2.internal_fb850784b78b1ad1b82871c81c9b1b43/haproxy/0.log" Apr 24 22:31:42.382981 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:42.382951 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wgtgk_0066d6d0-157c-4bef-89cb-0323a329a6a2/node-exporter/0.log" Apr 24 22:31:42.402775 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:42.402750 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wgtgk_0066d6d0-157c-4bef-89cb-0323a329a6a2/kube-rbac-proxy/0.log" Apr 24 22:31:42.433170 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:42.433146 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wgtgk_0066d6d0-157c-4bef-89cb-0323a329a6a2/init-textfile/0.log" Apr 24 
22:31:42.798316 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:42.798270 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-776974b45c-vbjbd_2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02/telemeter-client/0.log" Apr 24 22:31:42.818515 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:42.818476 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-776974b45c-vbjbd_2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02/reload/0.log" Apr 24 22:31:42.839227 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:42.839201 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-776974b45c-vbjbd_2dd49d6e-f91b-4ecc-a1e8-d743b25c4a02/kube-rbac-proxy/0.log" Apr 24 22:31:45.436376 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.436325 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"] Apr 24 22:31:45.440990 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.440964 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.447652 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.447623 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"]
Apr 24 22:31:45.614538 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.614502 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-sys\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.614701 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.614559 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc57g\" (UniqueName: \"kubernetes.io/projected/7da84e98-ffb4-4954-9e47-469131d10b9c-kube-api-access-bc57g\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.614701 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.614640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-proc\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.614701 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.614674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-podres\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.614810 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.614754 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-lib-modules\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.715993 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.715901 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-lib-modules\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.716219 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.716190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-lib-modules\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.716412 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.716263 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-sys\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.716567 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.716546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc57g\" (UniqueName: \"kubernetes.io/projected/7da84e98-ffb4-4954-9e47-469131d10b9c-kube-api-access-bc57g\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.717010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.716348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-sys\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.717010 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.716986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-proc\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.717136 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.717022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-podres\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.717136 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.717038 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-proc\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.717136 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.717129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7da84e98-ffb4-4954-9e47-469131d10b9c-podres\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.725769 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.725749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc57g\" (UniqueName: \"kubernetes.io/projected/7da84e98-ffb4-4954-9e47-469131d10b9c-kube-api-access-bc57g\") pod \"perf-node-gather-daemonset-n857s\" (UID: \"7da84e98-ffb4-4954-9e47-469131d10b9c\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.752488 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.752463 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:45.893950 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.893848 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"]
Apr 24 22:31:45.974068 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.974001 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hmts4_07d586c6-47b1-4dc6-96e0-7dac12734909/dns/0.log"
Apr 24 22:31:45.993629 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:45.993607 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hmts4_07d586c6-47b1-4dc6-96e0-7dac12734909/kube-rbac-proxy/0.log"
Apr 24 22:31:46.052152 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:46.052123 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s" event={"ID":"7da84e98-ffb4-4954-9e47-469131d10b9c","Type":"ContainerStarted","Data":"bd813fb5567e755b99b48799af9b16ffb9873288f3a4d6cdd4e0ff5af1e8a336"}
Apr 24 22:31:46.130226 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:46.130200 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gh9lb_95bfde24-898e-4ab6-9414-d93c895b9ba6/dns-node-resolver/0.log"
Apr 24 22:31:46.595624 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:46.595565 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6bff679f46-648dd_50557aa9-2f82-4792-88a0-6dca21949f46/registry/0.log"
Apr 24 22:31:46.668927 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:46.668898 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ws882_ee2e975b-8948-45ed-9de6-345f4c54c29e/node-ca/0.log"
Apr 24 22:31:47.057133 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:47.057088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s" event={"ID":"7da84e98-ffb4-4954-9e47-469131d10b9c","Type":"ContainerStarted","Data":"b4885298d0de116c0905c67bf3f795e6de8e01a2da434b353755d04b48bfc1d7"}
Apr 24 22:31:47.057358 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:47.057244 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:47.077491 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:47.077432 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s" podStartSLOduration=2.077413232 podStartE2EDuration="2.077413232s" podCreationTimestamp="2026-04-24 22:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:47.074764224 +0000 UTC m=+3907.922808342" watchObservedRunningTime="2026-04-24 22:31:47.077413232 +0000 UTC m=+3907.925457351"
Apr 24 22:31:47.707189 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:47.707150 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pjmg6_3d01f7a9-76ee-487e-9801-6c420df8721a/serve-healthcheck-canary/0.log"
Apr 24 22:31:48.061903 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:48.061865 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8kgzf_c334feeb-a7fc-4d45-ad79-e67520d0cd94/kube-rbac-proxy/0.log"
Apr 24 22:31:48.081557 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:48.081530 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8kgzf_c334feeb-a7fc-4d45-ad79-e67520d0cd94/exporter/0.log"
Apr 24 22:31:48.102332 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:48.102310 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8kgzf_c334feeb-a7fc-4d45-ad79-e67520d0cd94/extractor/0.log"
Apr 24 22:31:50.202171 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:50.202139 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-hqssg_cc602714-ec8b-41a5-b0c0-e3c463957643/server/0.log"
Apr 24 22:31:50.455744 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:50.455663 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-hg7rz_0b6eb7d7-ac63-4eac-9e1c-9fe2ec80fa9e/manager/0.log"
Apr 24 22:31:50.545009 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:50.544973 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-wm5mt_9399f7cb-5834-4797-891d-20468636cd00/seaweedfs/0.log"
Apr 24 22:31:50.568671 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:50.568642 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-wcllw_ffd29e2e-d014-48a5-bdfd-c85537d5fe45/seaweedfs-tls-custom/0.log"
Apr 24 22:31:50.589770 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:50.589748 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-sgxlb_cb1f1451-5b8c-4590-9836-205c97925933/seaweedfs-tls-serving/0.log"
Apr 24 22:31:53.069821 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:53.069789 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-n857s"
Apr 24 22:31:55.959124 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:55.959076 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gkkz_7f06c4fe-10e6-4600-864c-07dad67ed49f/kube-multus/0.log"
Apr 24 22:31:56.341592 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.341566 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phlsx_9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:31:56.368459 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.368433 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phlsx_9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b/egress-router-binary-copy/0.log"
Apr 24 22:31:56.389789 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.389757 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phlsx_9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b/cni-plugins/0.log"
Apr 24 22:31:56.411813 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.411782 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phlsx_9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b/bond-cni-plugin/0.log"
Apr 24 22:31:56.435262 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.435238 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phlsx_9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b/routeoverride-cni/0.log"
Apr 24 22:31:56.459946 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.459921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phlsx_9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b/whereabouts-cni-bincopy/0.log"
Apr 24 22:31:56.494994 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.494966 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phlsx_9f26a9e4-a21b-4fd3-a28f-f1eef2985f9b/whereabouts-cni/0.log"
Apr 24 22:31:56.644378 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.644289 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-npqvg_6db469f2-5afc-41c5-8338-9558deee2bd6/network-metrics-daemon/0.log"
Apr 24 22:31:56.666318 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:56.666269 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-npqvg_6db469f2-5afc-41c5-8338-9558deee2bd6/kube-rbac-proxy/0.log"
Apr 24 22:31:58.150395 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:58.150365 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k487s_98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6/ovn-controller/0.log"
Apr 24 22:31:58.198112 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:58.198086 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k487s_98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6/ovn-acl-logging/0.log"
Apr 24 22:31:58.228509 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:58.228439 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k487s_98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6/kube-rbac-proxy-node/0.log"
Apr 24 22:31:58.258856 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:58.258835 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k487s_98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:31:58.284245 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:58.284221 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k487s_98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6/northd/0.log"
Apr 24 22:31:58.310751 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:58.310729 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k487s_98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6/nbdb/0.log"
Apr 24 22:31:58.335364 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:58.335326 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k487s_98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6/sbdb/0.log"
Apr 24 22:31:58.464992 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:58.464918 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k487s_98cd2f7a-2a8b-46c5-8a0b-35d6ebb920c6/ovnkube-controller/0.log"
Apr 24 22:31:59.548157 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:31:59.548124 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-77b2w_9374c699-094f-4c29-9406-afbd076c9722/network-check-target-container/0.log"
Apr 24 22:32:00.472864 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:32:00.472836 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7wctt_1d60af74-5e4a-4c56-8738-0ea78867d785/iptables-alerter/0.log"
Apr 24 22:32:01.158599 ip-10-0-139-5 kubenswrapper[2571]: I0424 22:32:01.158569 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jk496_1e7352a7-e690-4558-a1e5-876926d3a57f/tuned/0.log"