Apr 21 03:54:41.043175 ip-10-0-134-15 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 03:54:41.043188 ip-10-0-134-15 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 03:54:41.043197 ip-10-0-134-15 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 03:54:41.043466 ip-10-0-134-15 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 03:54:51.214090 ip-10-0-134-15 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 03:54:51.214105 ip-10-0-134-15 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 67995beb36ed4c118d8b59a2cc8de7ab --
Apr 21 03:56:54.202998 ip-10-0-134-15 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.709141 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716478 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716489 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716493 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716496 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716500 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716503 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716506 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:56:54.728401 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716508 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716511 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716514 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716516 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716520 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716522 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716525 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716534 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716538 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716540 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716543 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716546 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716549 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716552 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716554 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716557 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716560 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716563 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716565 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716568 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:56:54.729877 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716571 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716573 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716576 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716579 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716581 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716584 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716588 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716593 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716595 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716598 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716601 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716604 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716607 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716609 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716612 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716614 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716617 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716620 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716622 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:56:54.730440 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716625 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716628 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716631 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716635 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716637 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716640 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716642 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716645 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716647 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716650 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716652 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716655 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716657 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716661 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716664 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716666 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716669 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716671 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716674 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716677 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:56:54.731188 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716679 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716682 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716685 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716688 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716691 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716694 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716696 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716699 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716701 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716704 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716706 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716709 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716711 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716714 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716718 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716720 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716723 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716725 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716728 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:56:54.731752 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.716732 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717123 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717129 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717131 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717134 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717137 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717140 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717143 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717147 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717149 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717152 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717154 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717157 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717160 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717163 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717171 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717174 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717178 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717182 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:56:54.732457 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717185 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717187 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717190 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717193 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717196 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717199 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717201 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717204 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717207 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717210 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717212 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717215 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717218 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717220 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717223 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717225 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717228 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717230 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717234 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717236 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:56:54.733015 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717239 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717242 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717247 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717249 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717252 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717254 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717257 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717259 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717262 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717265 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717268 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717270 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717273 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717275 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717278 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717280 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717283 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717286 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717289 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717292 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:56:54.733819 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717295 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717298 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717300 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717303 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717305 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717308 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717310 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717313 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717315 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717317 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717320 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717323 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717327 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717329 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717333 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717336 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717338 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717342 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717344 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717346 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:56:54.734398 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717349 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717352 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717355 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717357 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717360 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717362 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717365 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.717367 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719041 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719051 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719071 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719077 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719082 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719085 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719090 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719095 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719098 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719101 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719105 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719108 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719111 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719114 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719117 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 03:56:54.735260 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719120 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719123 2575 flags.go:64] FLAG: --cloud-config=""
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719125 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719131 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719136 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719139 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719142 2575 flags.go:64] FLAG: --config-dir=""
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719145 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719148 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719152 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719155 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719158 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719162 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719166 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719169 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719172 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719175 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719178 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719182 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719185 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719188 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719191 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719194 2575 flags.go:64] FLAG: --enable-server="true"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719197 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719201 2575 flags.go:64] FLAG: --event-burst="100"
Apr 21 03:56:54.914932 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719204 2575 flags.go:64] FLAG: --event-qps="50"
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719207 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719210 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719213 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719217 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719220 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719223 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719226 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719228 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719232 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719236 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]:
I0421 03:56:54.719240 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719242 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719245 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719248 2575 flags.go:64] FLAG: --feature-gates="" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719252 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719255 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719258 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719261 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719265 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719268 2575 flags.go:64] FLAG: --help="false" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719271 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-134-15.ec2.internal" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719275 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719277 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 03:56:54.917613 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719280 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 03:56:54.829827 ip-10-0-134-15 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719284 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719287 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719290 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719293 2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719296 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719299 2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719302 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719305 2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719308 2575 flags.go:64] FLAG: --kube-reserved=""
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719310 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719313 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719316 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719319 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719322 2575 flags.go:64] FLAG: --lock-file=""
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719325 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719327 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719330 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719337 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719340 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719343 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719345 2575 flags.go:64] FLAG: --logging-format="text"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719348 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719352 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 21 03:56:54.918874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719355 2575 flags.go:64] FLAG: --manifest-url=""
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719358 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719362 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719365 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719370 2575 flags.go:64] FLAG: --max-pods="110"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719373 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719377 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719380 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719383 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719386 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719389 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719392 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719399 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719402 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719406 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719409 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719412 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719417 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719419 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719423 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719425 2575 flags.go:64] FLAG: --port="10250"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719428 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719431 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04970c4488cf83e34"
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719435 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 21 03:56:54.919809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719438 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719440 2575 flags.go:64] FLAG: --register-node="true"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719445 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719448 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719451 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719454 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719457 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719460 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719464 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719467 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719470 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719472 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719476 2575 flags.go:64] FLAG: --runonce="false"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719479 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719482 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719485 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719489 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719491 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719495 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719498 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719501 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719504 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719506 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719509 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719512 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719515 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 21 03:56:54.920535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719518 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719520 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719529 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719532 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719534 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719539 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719542 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719545 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719549 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719552 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719556 2575 flags.go:64] FLAG: --v="2"
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719561 2575 flags.go:64] FLAG: --version="false"
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719565 2575 flags.go:64] FLAG: --vmodule=""
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719569 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719572 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719666 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719671 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719674 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719677 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719680 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719683 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719686 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719689 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:56:54.921512 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719692 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719695 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719697 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719700 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719702 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719705 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719707 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719710 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719713 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719715 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719718 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719720 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719723 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719725 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719728 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719731 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719733 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719737 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719740 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719744 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:56:54.922275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719747 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719750 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719752 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719768 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719771 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719774 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719776 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719779 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719782 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719784 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719787 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719790 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719793 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719796 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719799 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719801 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719804 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719807 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719809 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:56:54.923197 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719811 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719815 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719817 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719820 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719823 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719827 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719830 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719833 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719836 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719838 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719843 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719845 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719850 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719852 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719855 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719858 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719860 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719863 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719865 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:54.923840 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719868 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719870 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719873 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719876 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719878 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719881 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719885 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719887 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719890 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719894 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719897 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719900 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719902 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719905 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719907 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719910 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719912 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719915 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719917 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:56:54.924505 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.719920 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.719926 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.731996 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.732013 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732063 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732068 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732072 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732076 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732079 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732082 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732085 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732087 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732091 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732095 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732097 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732100 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:56:54.925366 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732102 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732105 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732108 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732110 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732113 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732115 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732118 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732120 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732123 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732126 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732128 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732130 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732133 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732136 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732139 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732142 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732145 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732147 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732150 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732152 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:56:54.925920 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732156 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421
03:56:54.732158 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732161 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732163 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732166 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732169 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732172 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732175 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732177 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732181 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732184 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732186 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732189 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732191 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 
03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732194 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732197 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732199 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732202 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732204 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:56:54.926525 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732207 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732210 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732212 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732215 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732218 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732221 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732224 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 
03:56:54.732226 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732229 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732232 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732234 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732237 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732239 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732242 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732245 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732248 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732250 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732253 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732255 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732258 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 
03:56:55.135478 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732261 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732265 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732269 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732272 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732275 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732278 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732280 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732283 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732286 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732288 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732290 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732293 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 
03:56:54.732295 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732298 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732300 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:56:55.136148 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.732306 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732399 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732404 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732407 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732410 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732413 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732416 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:56:55.136817 ip-10-0-134-15 
kubenswrapper[2575]: W0421 03:56:54.732419 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732422 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732424 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732427 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732435 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732438 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732441 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732443 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732446 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732448 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732451 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732454 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732456 2575 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 21 03:56:55.136817 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732459 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732462 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732464 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732468 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732471 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732474 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732476 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732479 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732482 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732484 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732487 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732489 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732491 2575 feature_gate.go:328] 
unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732494 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732496 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732503 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732505 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732508 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732510 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732513 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:56:55.137347 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732516 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732519 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732521 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732525 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732527 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 03:56:55.360275 ip-10-0-134-15 
kubenswrapper[2575]: W0421 03:56:54.732530 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732532 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732535 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732537 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732540 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732542 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732545 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732547 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732550 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732552 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732555 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732559 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732562 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 
03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732564 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732566 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:56:55.360275 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732569 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732572 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732574 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732577 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732579 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732582 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732584 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732588 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732591 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732593 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732597 2575 feature_gate.go:351] Setting GA feature 
gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732601 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732604 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732607 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732610 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732614 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732616 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732619 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732621 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:56:55.361025 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732624 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732626 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732629 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732631 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages 
Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732634 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732637 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732639 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:54.732643 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.732648 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.733470 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.735797 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.736796 2575 server.go:1019] "Starting client certificate rotation" Apr 21 03:56:55.362538 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.736893 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 03:56:55.362538 ip-10-0-134-15 
kubenswrapper[2575]: I0421 03:56:54.736930 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 03:56:55.362944 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.766687 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 03:56:55.362944 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.769557 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 03:56:55.362944 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.786143 2575 log.go:25] "Validated CRI v1 runtime API" Apr 21 03:56:55.362944 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.795494 2575 log.go:25] "Validated CRI v1 image API" Apr 21 03:56:55.362944 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.796914 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 03:56:55.362944 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.797325 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 03:56:55.362944 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.802080 2575 fs.go:135] Filesystem UUIDs: map[341342a2-e544-4b83-afb9-e7fc7598ef29:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a4e00004-2084-4d60-89fd-aec5e5ffdf40:/dev/nvme0n1p4] Apr 21 03:56:55.362944 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.802097 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs 
blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.807531 2575 manager.go:217] Machine: {Timestamp:2026-04-21 03:56:54.805623193 +0000 UTC m=+0.470213270 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099061 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec266b2d7c580de9c551d1482fd19236 SystemUUID:ec266b2d-7c58-0de9-c551-d1482fd19236 BootID:67995beb-36ed-4c11-8d8b-59a2cc8de7ab Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:61:77:6b:a0:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:61:77:6b:a0:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:ef:7b:34:fc:21 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.807629 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.807703 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.808573 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.808593 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-15.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.808733 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.808741 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.808768 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.809563 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.811065 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.811163 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.813625 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.813638 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.813650 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.813658 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.813666 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.815889 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 03:56:55.363331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.815906 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.819400 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.820943 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822453 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822468 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822474 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822479 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822484 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822491 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822497 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822502 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822509 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822515 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822523 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.822906 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.823772 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.823780 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.829078 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.829104 2575 server.go:1295] "Started kubelet"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.829536 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.829660 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.829726 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.830922 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-15.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.830935 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-15.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.830928 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.831130 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.835160 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.840039 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-15.ec2.internal.18a8431603c295f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-15.ec2.internal,UID:ip-10-0-134-15.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-15.ec2.internal,},FirstTimestamp:2026-04-21 03:56:54.829086194 +0000 UTC m=+0.493676271,LastTimestamp:2026-04-21 03:56:54.829086194 +0000 UTC m=+0.493676271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-15.ec2.internal,}"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.841176 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r79kj"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.842282 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.842847 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.843589 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.843655 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.843669 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.843780 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.843787 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.844012 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.847411 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.847427 2575 factory.go:55] Registering systemd factory
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.847437 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 21 03:56:55.363864 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.847516 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.847658 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r79kj"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.847729 2575 factory.go:153] Registering CRI-O factory
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.847746 2575 factory.go:223] Registration of the crio container factory successfully
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.847796 2575 factory.go:103] Registering Raw factory
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.847824 2575 manager.go:1196] Started watching for new ooms in manager
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.848251 2575 manager.go:319] Starting recovery of all containers
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.853480 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.855959 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-15.ec2.internal\" not found" node="ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.858018 2575 manager.go:324] Recovery completed
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.861959 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.865180 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.865209 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.865220 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientPID"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.865651 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.865662 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.865681 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.868313 2575 policy_none.go:49] "None policy: Start"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.868326 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.868336 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.917509 2575 manager.go:341] "Starting Device Plugin manager"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.917553 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.917563 2575 server.go:85] "Starting device plugin registration server"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.917747 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.917777 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.917869 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.917973 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:54.917982 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.918644 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:54.918685 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.018055 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.019006 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.019035 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.019049 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientPID"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.019072 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.021673 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.022963 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.022984 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.023001 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.023010 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.023073 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.024677 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.030372 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.030393 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-15.ec2.internal\": node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.069494 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.123955 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal"]
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.124027 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.124799 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 03:56:55.364946 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.124825 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.124840 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientPID"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.125926 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.126067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.126092 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.126606 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.126628 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.126644 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientPID"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.126652 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.126668 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.126681 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientPID"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.127709 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.127735 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.128328 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.128351 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.128360 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeHasSufficientPID"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.145803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/71f54f10b0458b533d193f0621e7e37f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal\" (UID: \"71f54f10b0458b533d193f0621e7e37f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.145824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71f54f10b0458b533d193f0621e7e37f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal\" (UID: \"71f54f10b0458b533d193f0621e7e37f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.145842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb235a448aa33328082b3f09d40d9023-config\") pod \"kube-apiserver-proxy-ip-10-0-134-15.ec2.internal\" (UID: \"cb235a448aa33328082b3f09d40d9023\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.158293 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-15.ec2.internal\" not found" node="ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.162574 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-15.ec2.internal\" not found" node="ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.169817 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.246494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/71f54f10b0458b533d193f0621e7e37f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal\" (UID: \"71f54f10b0458b533d193f0621e7e37f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.246503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/71f54f10b0458b533d193f0621e7e37f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal\" (UID: \"71f54f10b0458b533d193f0621e7e37f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.246548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71f54f10b0458b533d193f0621e7e37f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal\" (UID: \"71f54f10b0458b533d193f0621e7e37f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.246567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb235a448aa33328082b3f09d40d9023-config\") pod \"kube-apiserver-proxy-ip-10-0-134-15.ec2.internal\" (UID: \"cb235a448aa33328082b3f09d40d9023\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.366309 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.246607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb235a448aa33328082b3f09d40d9023-config\") pod \"kube-apiserver-proxy-ip-10-0-134-15.ec2.internal\" (UID: \"cb235a448aa33328082b3f09d40d9023\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.367341 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.246619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71f54f10b0458b533d193f0621e7e37f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal\" (UID: \"71f54f10b0458b533d193f0621e7e37f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.367341 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.270214 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.580149 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.371257 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.580149 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.462046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.580149 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.465403 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:55.580149 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.471926 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.580149 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.572482 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.672967 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.672938 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.737137 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.737108 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 03:56:55.737792 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.737242 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 03:56:55.737792 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.737279 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 03:56:55.773816 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.773795 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.842554 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.842498 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 03:56:55.850043 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.850020 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 03:51:54 +0000 UTC" deadline="2027-10-25 04:34:44.886974375 +0000 UTC"
Apr 21 03:56:55.850043 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.850041 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13248h37m49.036936442s"
Apr 21 03:56:55.855113 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.855094 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 03:56:55.874290 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.874271 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:55.875550 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.875534 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xz8g7"
Apr 21 03:56:55.883934 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:55.883911 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xz8g7"
Apr 21 03:56:55.974950 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:55.974926 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:56.024606 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:56.024575 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb235a448aa33328082b3f09d40d9023.slice/crio-fbf85f7f0f42503532cc27d6974f117f8cf75844e411139a15b1c4be767900c5 WatchSource:0}: Error finding container fbf85f7f0f42503532cc27d6974f117f8cf75844e411139a15b1c4be767900c5: Status 404 returned error can't find the container with id fbf85f7f0f42503532cc27d6974f117f8cf75844e411139a15b1c4be767900c5
Apr 21 03:56:56.025333 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:56.025309 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f54f10b0458b533d193f0621e7e37f.slice/crio-2b8211e1701bcea7770175f3c64f9870d288ecce323cd0e7c919f56814fefdb0 WatchSource:0}: Error finding container 2b8211e1701bcea7770175f3c64f9870d288ecce323cd0e7c919f56814fefdb0: Status 404 returned error can't find the container with id 2b8211e1701bcea7770175f3c64f9870d288ecce323cd0e7c919f56814fefdb0
Apr 21 03:56:56.029890 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.029868 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 03:56:56.075177 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.075023 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-15.ec2.internal\" not found"
Apr 21 03:56:56.128031 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.127983 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:56:56.144106 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.144086 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:56.151686 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.151668 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:56:56.156471 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.156458 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 03:56:56.157355 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.157344 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal"
Apr 21 03:56:56.163791 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.163779 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 03:56:56.814919 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.814890 2575 apiserver.go:52] "Watching apiserver"
Apr 21 03:56:56.823269 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.823241 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 03:56:56.824512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.824445 2575 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-multus/multus-zgz2x","openshift-multus/network-metrics-daemon-gpfrt","openshift-network-operator/iptables-alerter-rzjbf","kube-system/konnectivity-agent-6w2n8","openshift-dns/node-resolver-j8nd9","openshift-multus/multus-additional-cni-plugins-phc24","openshift-network-diagnostics/network-check-target-9cslm","openshift-ovn-kubernetes/ovnkube-node-kxcbd","kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws","openshift-cluster-node-tuning-operator/tuned-ldkbb","openshift-image-registry/node-ca-tqkpr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal"] Apr 21 03:56:56.829392 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.829370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:56:56.829502 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.829467 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:56:56.831676 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.831505 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:56:56.831676 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.831579 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:56:56.833793 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.833724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rzjbf" Apr 21 03:56:56.833793 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.833740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:56:56.836018 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.835996 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j8nd9" Apr 21 03:56:56.836330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.836313 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dr94c\"" Apr 21 03:56:56.836647 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.836627 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 03:56:56.836647 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.836638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8d2pz\"" Apr 21 03:56:56.836791 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.836650 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 03:56:56.836911 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.836894 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:56:56.836979 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.836909 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 
03:56:56.837061 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.837046 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 03:56:56.838579 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.838559 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 03:56:56.838774 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.838737 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 03:56:56.838948 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.838932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8qztf\"" Apr 21 03:56:56.840706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.840685 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.843743 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.842919 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rvcl2\"" Apr 21 03:56:56.843743 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.843173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 03:56:56.843743 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.843247 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 03:56:56.843743 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.843252 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.843994 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.843873 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 03:56:56.844173 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.844159 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 03:56:56.844415 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.844398 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 03:56:56.846218 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.845891 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 03:56:56.846218 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.845939 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 03:56:56.846218 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.846099 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-lwmrt\"" Apr 21 03:56:56.846218 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.846134 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.846570 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.846553 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 03:56:56.847336 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.847217 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 03:56:56.847336 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.847238 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 03:56:56.847656 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.847638 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 03:56:56.848901 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.848882 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 03:56:56.849339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.849320 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6zlb8\"" Apr 21 03:56:56.851881 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.850594 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:56.851881 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.851167 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.853812 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.853788 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vbnx2\"" Apr 21 03:56:56.854487 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.854464 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 03:56:56.854581 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.854544 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:56.854643 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.854625 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 03:56:56.854782 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.854747 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:56:56.855221 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855174 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 03:56:56.855384 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855366 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-765kf\"" Apr 21 03:56:56.855455 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855418 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 03:56:56.855583 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855516 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/332698b5-e816-48f5-806f-295e3ed3f8fb-agent-certs\") pod \"konnectivity-agent-6w2n8\" (UID: \"332698b5-e816-48f5-806f-295e3ed3f8fb\") " pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:56:56.855583 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-daemon-config\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.855672 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14f705da-fc93-4286-8c99-eb46f7420053-host-slash\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf" Apr 21 03:56:56.855702 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-kubelet\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.855735 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-etc-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.855793 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-cni-netd\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.855793 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-ovnkube-script-lib\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.855883 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-system-cni-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.855883 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-cni-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.855883 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-multus-certs\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.855883 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-slash\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.856043 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7wg5\" (UniqueName: \"kubernetes.io/projected/9acdf950-6bdd-4903-943b-90a6f96b5271-kube-api-access-q7wg5\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.856043 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-os-release\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.856043 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-hostroot\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.856043 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.855995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-socket-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.856043 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-systemd\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.856275 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/332698b5-e816-48f5-806f-295e3ed3f8fb-konnectivity-ca\") pod \"konnectivity-agent-6w2n8\" (UID: \"332698b5-e816-48f5-806f-295e3ed3f8fb\") " pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:56:56.856275 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/494fdf31-01dd-419f-a5be-8ea679099b8e-cni-binary-copy\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.856275 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.856275 ip-10-0-134-15 
kubenswrapper[2575]: I0421 03:56:56.856123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnr8\" (UniqueName: \"kubernetes.io/projected/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-kube-api-access-hpnr8\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:56:56.856275 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlwld\" (UniqueName: \"kubernetes.io/projected/89bfd36f-311d-477d-a1ba-cc9a2854b55d-kube-api-access-wlwld\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9" Apr 21 03:56:56.856275 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-cni-bin\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.856275 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-conf-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.856275 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvwhk\" (UniqueName: \"kubernetes.io/projected/494fdf31-01dd-419f-a5be-8ea679099b8e-kube-api-access-rvwhk\") pod \"multus-zgz2x\" (UID: 
\"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.856275 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856253 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cnibin\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.856597 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvqvg\" (UniqueName: \"kubernetes.io/projected/aa5d6fee-c188-4cee-a9dd-cc90927bef31-kube-api-access-rvqvg\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.856597 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-kubelet\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.856597 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-var-lib-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.856597 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-node-log\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.856597 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-log-socket\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.856597 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj9k9\" (UniqueName: \"kubernetes.io/projected/14f705da-fc93-4286-8c99-eb46f7420053-kube-api-access-zj9k9\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf" Apr 21 03:56:56.856597 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/89bfd36f-311d-477d-a1ba-cc9a2854b55d-hosts-file\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9" Apr 21 03:56:56.856597 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 
03:56:56.856616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-device-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-env-overrides\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856695 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-k8s-cni-cncf-io\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-registration-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-etc-kubernetes\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdhk8\" (UniqueName: \"kubernetes.io/projected/5b202830-ba31-44af-9eb0-ac33aeef57d1-kube-api-access-wdhk8\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " pod="openshift-network-diagnostics/network-check-target-9cslm"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89bfd36f-311d-477d-a1ba-cc9a2854b55d-tmp-dir\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.856982 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.856979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-ovnkube-config\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.857469 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.857469 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-ovn\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.857469 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14f705da-fc93-4286-8c99-eb46f7420053-iptables-alerter-script\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf"
Apr 21 03:56:56.857469 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857292 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 03:56:56.857469 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857353 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kkx9s\""
Apr 21 03:56:56.857469 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857371 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 03:56:56.857743 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857526 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 03:56:56.857821 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-system-cni-dir\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.857886 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-systemd-units\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.857886 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cni-binary-copy\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.857983 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.857983 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-cnibin\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.857983 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.857961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-cni-multus\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.858124 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.858124 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-netns\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.858124 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-cni-bin\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.858124 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-os-release\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.858124 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-socket-dir-parent\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.858357 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-run-netns\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.858357 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9acdf950-6bdd-4903-943b-90a6f96b5271-ovn-node-metrics-cert\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.858357 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.858357 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.858357 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.858212 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-sys-fs\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.884722 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.884693 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:51:55 +0000 UTC" deadline="2028-01-22 10:37:23.70111026 +0000 UTC"
Apr 21 03:56:56.884722 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.884721 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15390h40m26.816392395s"
Apr 21 03:56:56.938170 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.938144 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:56:56.944727 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.944704 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 03:56:56.959179 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvwhk\" (UniqueName: \"kubernetes.io/projected/494fdf31-01dd-419f-a5be-8ea679099b8e-kube-api-access-rvwhk\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.959269 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysctl-conf\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.959269 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959220 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-systemd\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.959269 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-sys\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.959411 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cnibin\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.959411 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvqvg\" (UniqueName: \"kubernetes.io/projected/aa5d6fee-c188-4cee-a9dd-cc90927bef31-kube-api-access-rvqvg\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.959411 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-kubelet\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.959411 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cnibin\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.959411 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-var-lib-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.959411 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-node-log\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-kubelet\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-log-socket\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysconfig\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959458 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-node-log\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-var-lib-kubelet\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-var-lib-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-log-socket\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj9k9\" (UniqueName: \"kubernetes.io/projected/14f705da-fc93-4286-8c99-eb46f7420053-kube-api-access-zj9k9\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/89bfd36f-311d-477d-a1ba-cc9a2854b55d-hosts-file\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-device-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.959677 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/89bfd36f-311d-477d-a1ba-cc9a2854b55d-hosts-file\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9"
Apr 21 03:56:56.960190 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.960190 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.959819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-device-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.960190 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-env-overrides\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.960190 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-k8s-cni-cncf-io\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.960337 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-registration-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.960337 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-etc-kubernetes\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.960337 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdhk8\" (UniqueName: \"kubernetes.io/projected/5b202830-ba31-44af-9eb0-ac33aeef57d1-kube-api-access-wdhk8\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.960337 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " pod="openshift-network-diagnostics/network-check-target-9cslm"
Apr 21 03:56:56.960337 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-k8s-cni-cncf-io\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.960337 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt"
Apr 21 03:56:56.960337 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89bfd36f-311d-477d-a1ba-cc9a2854b55d-tmp-dir\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9"
Apr 21 03:56:56.960599 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.960425 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:56:56.960599 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.960500 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs podName:a42d1f99-c5af-45d0-9ce8-8affe0d01ea4 nodeName:}" failed. No retries permitted until 2026-04-21 03:56:57.460471556 +0000 UTC m=+3.125061623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs") pod "network-metrics-daemon-gpfrt" (UID: "a42d1f99-c5af-45d0-9ce8-8affe0d01ea4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:56:56.960685 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.960685 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-ovnkube-config\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.960791 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/021801b3-5414-43aa-8164-e9365f642f91-etc-tuned\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.960791 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.960880 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-ovn\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.960880 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14f705da-fc93-4286-8c99-eb46f7420053-iptables-alerter-script\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf"
Apr 21 03:56:56.960880 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-system-cni-dir\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.960986 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-systemd-units\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.960986 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysctl-d\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.960986 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cni-binary-copy\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.960986 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.961115 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.960985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-cnibin\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.961115 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-cni-multus\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.961115 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-lib-modules\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.961115 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961064 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzlzn\" (UniqueName: \"kubernetes.io/projected/021801b3-5414-43aa-8164-e9365f642f91-kube-api-access-qzlzn\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb"
Apr 21 03:56:56.961115 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-registration-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.961115 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961092 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-netns\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-cni-bin\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-os-release\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-socket-dir-parent\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-run-netns\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9acdf950-6bdd-4903-943b-90a6f96b5271-ovn-node-metrics-cert\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89bfd36f-311d-477d-a1ba-cc9a2854b55d-tmp-dir\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.961339 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-sys-fs\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws"
Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hjpm\" (UniqueName: \"kubernetes.io/projected/48670ed3-db9b-4299-a9b9-270bf4c32561-kube-api-access-5hjpm\") pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr"
Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/332698b5-e816-48f5-806f-295e3ed3f8fb-agent-certs\") pod \"konnectivity-agent-6w2n8\" (UID: \"332698b5-e816-48f5-806f-295e3ed3f8fb\") " pod="kube-system/konnectivity-agent-6w2n8"
Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName:
\"kubernetes.io/configmap/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-daemon-config\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961444 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-modprobe-d\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/021801b3-5414-43aa-8164-e9365f642f91-tmp\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14f705da-fc93-4286-8c99-eb46f7420053-host-slash\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-kubelet\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-etc-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-cni-netd\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-ovnkube-script-lib\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-host\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:56.961706 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-system-cni-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-cni-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-multus-certs\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-slash\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7wg5\" (UniqueName: \"kubernetes.io/projected/9acdf950-6bdd-4903-943b-90a6f96b5271-kube-api-access-q7wg5\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-os-release\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-hostroot\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-socket-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-systemd\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48670ed3-db9b-4299-a9b9-270bf4c32561-host\") pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-env-overrides\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48670ed3-db9b-4299-a9b9-270bf4c32561-serviceca\") pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/332698b5-e816-48f5-806f-295e3ed3f8fb-konnectivity-ca\") pod \"konnectivity-agent-6w2n8\" (UID: \"332698b5-e816-48f5-806f-295e3ed3f8fb\") " pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/494fdf31-01dd-419f-a5be-8ea679099b8e-cni-binary-copy\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-cni-bin\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnr8\" (UniqueName: \"kubernetes.io/projected/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-kube-api-access-hpnr8\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-ovnkube-config\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.962330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-system-cni-dir\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14f705da-fc93-4286-8c99-eb46f7420053-iptables-alerter-script\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962197 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-os-release\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-socket-dir-parent\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-run-netns\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-os-release\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9acdf950-6bdd-4903-943b-90a6f96b5271-ovnkube-script-lib\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-multus-certs\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962925 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-slash\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-hostroot\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.963099 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-systemd-units\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.963549 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-systemd\") pod \"ovnkube-node-kxcbd\" (UID: 
\"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.963549 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-cni-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.963549 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-cnibin\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.963549 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-cni-multus\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.963549 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-socket-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.963775 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-phc24\" (UID: 
\"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.963947 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-run-ovn\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.964011 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-kubelet\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.964011 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.963988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-etc-openvswitch\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.964081 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.964081 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: 
\"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.964081 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b202830-ba31-44af-9eb0-ac33aeef57d1-sys-fs\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.964081 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-cni-binary-copy\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.964214 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa5d6fee-c188-4cee-a9dd-cc90927bef31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.964253 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14f705da-fc93-4286-8c99-eb46f7420053-host-slash\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf" Apr 21 03:56:56.964427 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.961097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-etc-kubernetes\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.964512 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/332698b5-e816-48f5-806f-295e3ed3f8fb-konnectivity-ca\") pod \"konnectivity-agent-6w2n8\" (UID: \"332698b5-e816-48f5-806f-295e3ed3f8fb\") " pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:56:56.964623 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.962140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlwld\" (UniqueName: \"kubernetes.io/projected/89bfd36f-311d-477d-a1ba-cc9a2854b55d-kube-api-access-wlwld\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9" Apr 21 03:56:56.964703 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa5d6fee-c188-4cee-a9dd-cc90927bef31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.964791 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.964772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9acdf950-6bdd-4903-943b-90a6f96b5271-host-cni-netd\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.965143 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-daemon-config\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.965209 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-system-cni-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.965209 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/494fdf31-01dd-419f-a5be-8ea679099b8e-cni-binary-copy\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.965209 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-run-netns\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.965209 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-kubernetes\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:56.965401 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965213 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-run\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:56.965401 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-cni-bin\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.965401 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-host-var-lib-cni-bin\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.965401 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-conf-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.965589 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.965448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/494fdf31-01dd-419f-a5be-8ea679099b8e-multus-conf-dir\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.966619 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.966594 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9acdf950-6bdd-4903-943b-90a6f96b5271-ovn-node-metrics-cert\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.968183 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.968149 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:56:56.968322 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.968309 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:56:56.968435 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.968424 2575 projected.go:194] Error preparing data for projected volume kube-api-access-n9prw for pod openshift-network-diagnostics/network-check-target-9cslm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:56:56.968632 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:56.968618 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw podName:541c9c2c-7803-469b-9f76-fa3ec6995458 nodeName:}" failed. No retries permitted until 2026-04-21 03:56:57.468575399 +0000 UTC m=+3.133165485 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n9prw" (UniqueName: "kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw") pod "network-check-target-9cslm" (UID: "541c9c2c-7803-469b-9f76-fa3ec6995458") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:56:56.968957 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.968927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/332698b5-e816-48f5-806f-295e3ed3f8fb-agent-certs\") pod \"konnectivity-agent-6w2n8\" (UID: \"332698b5-e816-48f5-806f-295e3ed3f8fb\") " pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:56:56.969500 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.969478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvwhk\" (UniqueName: \"kubernetes.io/projected/494fdf31-01dd-419f-a5be-8ea679099b8e-kube-api-access-rvwhk\") pod \"multus-zgz2x\" (UID: \"494fdf31-01dd-419f-a5be-8ea679099b8e\") " pod="openshift-multus/multus-zgz2x" Apr 21 03:56:56.970137 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.970099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdhk8\" (UniqueName: \"kubernetes.io/projected/5b202830-ba31-44af-9eb0-ac33aeef57d1-kube-api-access-wdhk8\") pod \"aws-ebs-csi-driver-node-hwcws\" (UID: \"5b202830-ba31-44af-9eb0-ac33aeef57d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:56.970453 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.970431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvqvg\" (UniqueName: \"kubernetes.io/projected/aa5d6fee-c188-4cee-a9dd-cc90927bef31-kube-api-access-rvqvg\") pod \"multus-additional-cni-plugins-phc24\" (UID: \"aa5d6fee-c188-4cee-a9dd-cc90927bef31\") " 
pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:56.971097 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.971074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlwld\" (UniqueName: \"kubernetes.io/projected/89bfd36f-311d-477d-a1ba-cc9a2854b55d-kube-api-access-wlwld\") pod \"node-resolver-j8nd9\" (UID: \"89bfd36f-311d-477d-a1ba-cc9a2854b55d\") " pod="openshift-dns/node-resolver-j8nd9" Apr 21 03:56:56.971176 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.971157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj9k9\" (UniqueName: \"kubernetes.io/projected/14f705da-fc93-4286-8c99-eb46f7420053-kube-api-access-zj9k9\") pod \"iptables-alerter-rzjbf\" (UID: \"14f705da-fc93-4286-8c99-eb46f7420053\") " pod="openshift-network-operator/iptables-alerter-rzjbf" Apr 21 03:56:56.975374 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.975357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7wg5\" (UniqueName: \"kubernetes.io/projected/9acdf950-6bdd-4903-943b-90a6f96b5271-kube-api-access-q7wg5\") pod \"ovnkube-node-kxcbd\" (UID: \"9acdf950-6bdd-4903-943b-90a6f96b5271\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:56.976809 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:56.976789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnr8\" (UniqueName: \"kubernetes.io/projected/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-kube-api-access-hpnr8\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:56:57.026947 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.026867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal" 
event={"ID":"71f54f10b0458b533d193f0621e7e37f","Type":"ContainerStarted","Data":"2b8211e1701bcea7770175f3c64f9870d288ecce323cd0e7c919f56814fefdb0"} Apr 21 03:56:57.027903 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.027879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal" event={"ID":"cb235a448aa33328082b3f09d40d9023","Type":"ContainerStarted","Data":"fbf85f7f0f42503532cc27d6974f117f8cf75844e411139a15b1c4be767900c5"} Apr 21 03:56:57.065978 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.065949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysctl-d\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066092 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.065988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-lib-modules\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066092 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzlzn\" (UniqueName: \"kubernetes.io/projected/021801b3-5414-43aa-8164-e9365f642f91-kube-api-access-qzlzn\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066092 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hjpm\" (UniqueName: \"kubernetes.io/projected/48670ed3-db9b-4299-a9b9-270bf4c32561-kube-api-access-5hjpm\") 
pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:57.066244 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-lib-modules\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066244 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066154 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysctl-d\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066244 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-modprobe-d\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066244 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/021801b3-5414-43aa-8164-e9365f642f91-tmp\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066244 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-host\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") 
" pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48670ed3-db9b-4299-a9b9-270bf4c32561-host\") pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48670ed3-db9b-4299-a9b9-270bf4c32561-serviceca\") pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-kubernetes\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-modprobe-d\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-host\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 
03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-run\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48670ed3-db9b-4299-a9b9-270bf4c32561-host\") pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066381 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysctl-conf\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066398 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-run\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-systemd\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066428 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-sys\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysconfig\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.066475 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-var-lib-kubelet\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.067087 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-systemd\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.067087 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/021801b3-5414-43aa-8164-e9365f642f91-etc-tuned\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.067087 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066560 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-var-lib-kubelet\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.067087 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-kubernetes\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.067087 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysctl-conf\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.067087 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-sys\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.067087 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48670ed3-db9b-4299-a9b9-270bf4c32561-serviceca\") pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:57.067087 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.066830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/021801b3-5414-43aa-8164-e9365f642f91-etc-sysconfig\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.068898 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.068880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/021801b3-5414-43aa-8164-e9365f642f91-tmp\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.069004 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.068984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/021801b3-5414-43aa-8164-e9365f642f91-etc-tuned\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.074900 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.074881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hjpm\" (UniqueName: \"kubernetes.io/projected/48670ed3-db9b-4299-a9b9-270bf4c32561-kube-api-access-5hjpm\") pod \"node-ca-tqkpr\" (UID: \"48670ed3-db9b-4299-a9b9-270bf4c32561\") " pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:57.075013 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.074948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzlzn\" (UniqueName: \"kubernetes.io/projected/021801b3-5414-43aa-8164-e9365f642f91-kube-api-access-qzlzn\") pod \"tuned-ldkbb\" (UID: \"021801b3-5414-43aa-8164-e9365f642f91\") " pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.153980 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.153945 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rzjbf" Apr 21 03:56:57.163623 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.163601 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:56:57.172335 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.172311 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j8nd9" Apr 21 03:56:57.176861 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.176843 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-phc24" Apr 21 03:56:57.184490 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.184472 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" Apr 21 03:56:57.191022 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.191002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zgz2x" Apr 21 03:56:57.198413 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.198397 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" Apr 21 03:56:57.204898 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.204881 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" Apr 21 03:56:57.209331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.209315 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tqkpr" Apr 21 03:56:57.284097 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.284043 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:56:57.468831 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.468796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:56:57.468976 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.468840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:56:57.468976 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:57.468936 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:56:57.468976 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:57.468975 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:56:57.469115 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:57.468989 2575 projected.go:194] Error preparing data for projected volume kube-api-access-n9prw for pod openshift-network-diagnostics/network-check-target-9cslm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:56:57.469115 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:57.468934 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:56:57.469115 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:57.469046 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw podName:541c9c2c-7803-469b-9f76-fa3ec6995458 nodeName:}" failed. No retries permitted until 2026-04-21 03:56:58.46902793 +0000 UTC m=+4.133617997 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-n9prw" (UniqueName: "kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw") pod "network-check-target-9cslm" (UID: "541c9c2c-7803-469b-9f76-fa3ec6995458") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:56:57.469115 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:57.469104 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs podName:a42d1f99-c5af-45d0-9ce8-8affe0d01ea4 nodeName:}" failed. No retries permitted until 2026-04-21 03:56:58.469080089 +0000 UTC m=+4.133670154 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs") pod "network-metrics-daemon-gpfrt" (UID: "a42d1f99-c5af-45d0-9ce8-8affe0d01ea4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:56:57.650939 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:57.650915 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89bfd36f_311d_477d_a1ba_cc9a2854b55d.slice/crio-85a76860255790382abe6bad65c11242d2a5b0353105bf6a0825c274c6959ef4 WatchSource:0}: Error finding container 85a76860255790382abe6bad65c11242d2a5b0353105bf6a0825c274c6959ef4: Status 404 returned error can't find the container with id 85a76860255790382abe6bad65c11242d2a5b0353105bf6a0825c274c6959ef4 Apr 21 03:56:57.652284 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:57.652245 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod021801b3_5414_43aa_8164_e9365f642f91.slice/crio-0050f3ca7d7bcdd58ceb41b991736305e560a6fc5cfb3f5b14c75bfd43e8ee4a WatchSource:0}: Error finding container 0050f3ca7d7bcdd58ceb41b991736305e560a6fc5cfb3f5b14c75bfd43e8ee4a: Status 404 returned error can't find the container with id 0050f3ca7d7bcdd58ceb41b991736305e560a6fc5cfb3f5b14c75bfd43e8ee4a Apr 21 03:56:57.656474 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:57.656437 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod332698b5_e816_48f5_806f_295e3ed3f8fb.slice/crio-2a8cbe3675153b354835c0f9b5191ab7e6dc47d1b9ed7ace36131f4b39ac11c1 WatchSource:0}: Error finding container 2a8cbe3675153b354835c0f9b5191ab7e6dc47d1b9ed7ace36131f4b39ac11c1: Status 404 returned error can't find the container with id 2a8cbe3675153b354835c0f9b5191ab7e6dc47d1b9ed7ace36131f4b39ac11c1 Apr 21 03:56:57.657109 
ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:57.657045 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa5d6fee_c188_4cee_a9dd_cc90927bef31.slice/crio-6802fa62d045f4221523d7b89b856a3a9cd22c44042dbd12af8d2ba586bb2616 WatchSource:0}: Error finding container 6802fa62d045f4221523d7b89b856a3a9cd22c44042dbd12af8d2ba586bb2616: Status 404 returned error can't find the container with id 6802fa62d045f4221523d7b89b856a3a9cd22c44042dbd12af8d2ba586bb2616 Apr 21 03:56:57.657726 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:57.657695 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14f705da_fc93_4286_8c99_eb46f7420053.slice/crio-e4ca2b7c88a86c620c3ab36f30bb6ba77a7f51886e708af505f88d33f048d3c6 WatchSource:0}: Error finding container e4ca2b7c88a86c620c3ab36f30bb6ba77a7f51886e708af505f88d33f048d3c6: Status 404 returned error can't find the container with id e4ca2b7c88a86c620c3ab36f30bb6ba77a7f51886e708af505f88d33f048d3c6 Apr 21 03:56:57.679177 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:57.678989 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9acdf950_6bdd_4903_943b_90a6f96b5271.slice/crio-46eaf1eaf17c468052f40fc485c6f6bc112904f405a0dc4d970b18f0ffb506f8 WatchSource:0}: Error finding container 46eaf1eaf17c468052f40fc485c6f6bc112904f405a0dc4d970b18f0ffb506f8: Status 404 returned error can't find the container with id 46eaf1eaf17c468052f40fc485c6f6bc112904f405a0dc4d970b18f0ffb506f8 Apr 21 03:56:57.680938 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:56:57.680913 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod494fdf31_01dd_419f_a5be_8ea679099b8e.slice/crio-bc02fa8274b4e3b1b195905b0b87ccaa439300503be7736c45928c92d34480ac WatchSource:0}: Error 
finding container bc02fa8274b4e3b1b195905b0b87ccaa439300503be7736c45928c92d34480ac: Status 404 returned error can't find the container with id bc02fa8274b4e3b1b195905b0b87ccaa439300503be7736c45928c92d34480ac Apr 21 03:56:57.885724 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.885691 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:51:55 +0000 UTC" deadline="2027-12-01 07:06:08.490779337 +0000 UTC" Apr 21 03:56:57.885724 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:57.885720 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14139h9m10.605061304s" Apr 21 03:56:58.023321 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.023212 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:56:58.023321 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.023254 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:56:58.023513 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:58.023345 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:56:58.023513 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:58.023408 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:56:58.032542 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.032512 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zgz2x" event={"ID":"494fdf31-01dd-419f-a5be-8ea679099b8e","Type":"ContainerStarted","Data":"bc02fa8274b4e3b1b195905b0b87ccaa439300503be7736c45928c92d34480ac"} Apr 21 03:56:58.036653 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.036623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"46eaf1eaf17c468052f40fc485c6f6bc112904f405a0dc4d970b18f0ffb506f8"} Apr 21 03:56:58.043124 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.043080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j8nd9" event={"ID":"89bfd36f-311d-477d-a1ba-cc9a2854b55d","Type":"ContainerStarted","Data":"85a76860255790382abe6bad65c11242d2a5b0353105bf6a0825c274c6959ef4"} Apr 21 03:56:58.047869 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.047840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal" event={"ID":"cb235a448aa33328082b3f09d40d9023","Type":"ContainerStarted","Data":"b6551b8fa8bbd635d28387790e545b25867707dbf592d7509bd972d27951f3fd"} Apr 21 03:56:58.051624 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.051597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tqkpr" event={"ID":"48670ed3-db9b-4299-a9b9-270bf4c32561","Type":"ContainerStarted","Data":"77f803773ecf3647c7c0a9dfeaa9a815fc6c94af56c3b0c52c116dce6991f80b"} Apr 21 03:56:58.054272 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.054247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" event={"ID":"5b202830-ba31-44af-9eb0-ac33aeef57d1","Type":"ContainerStarted","Data":"bd101c1790b86d547cd60a85f99a4b3d5179d5edca59351ae1fa3ab99a7e8dae"} Apr 21 03:56:58.058017 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.057992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rzjbf" event={"ID":"14f705da-fc93-4286-8c99-eb46f7420053","Type":"ContainerStarted","Data":"e4ca2b7c88a86c620c3ab36f30bb6ba77a7f51886e708af505f88d33f048d3c6"} Apr 21 03:56:58.064435 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.064379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phc24" event={"ID":"aa5d6fee-c188-4cee-a9dd-cc90927bef31","Type":"ContainerStarted","Data":"6802fa62d045f4221523d7b89b856a3a9cd22c44042dbd12af8d2ba586bb2616"} Apr 21 03:56:58.068215 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.068156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6w2n8" event={"ID":"332698b5-e816-48f5-806f-295e3ed3f8fb","Type":"ContainerStarted","Data":"2a8cbe3675153b354835c0f9b5191ab7e6dc47d1b9ed7ace36131f4b39ac11c1"} Apr 21 03:56:58.070491 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.070428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" event={"ID":"021801b3-5414-43aa-8164-e9365f642f91","Type":"ContainerStarted","Data":"0050f3ca7d7bcdd58ceb41b991736305e560a6fc5cfb3f5b14c75bfd43e8ee4a"} Apr 21 03:56:58.475086 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.475007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " 
pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:56:58.475086 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:58.475057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:56:58.475305 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:58.475183 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:56:58.475305 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:58.475241 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs podName:a42d1f99-c5af-45d0-9ce8-8affe0d01ea4 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:00.475222015 +0000 UTC m=+6.139812081 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs") pod "network-metrics-daemon-gpfrt" (UID: "a42d1f99-c5af-45d0-9ce8-8affe0d01ea4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:56:58.475411 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:58.475325 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:56:58.475411 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:58.475341 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:56:58.475411 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:58.475354 2575 projected.go:194] Error preparing data for projected volume kube-api-access-n9prw for pod openshift-network-diagnostics/network-check-target-9cslm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:56:58.475411 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:56:58.475391 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw podName:541c9c2c-7803-469b-9f76-fa3ec6995458 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:00.475378973 +0000 UTC m=+6.139969037 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n9prw" (UniqueName: "kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw") pod "network-check-target-9cslm" (UID: "541c9c2c-7803-469b-9f76-fa3ec6995458") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:56:59.086288 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:59.085086 2575 generic.go:358] "Generic (PLEG): container finished" podID="71f54f10b0458b533d193f0621e7e37f" containerID="f86be07c97dfa5bdb703e01cb19f51094f8d8dc964814a8a422f7942e2ac8341" exitCode=0 Apr 21 03:56:59.086288 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:59.085970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal" event={"ID":"71f54f10b0458b533d193f0621e7e37f","Type":"ContainerDied","Data":"f86be07c97dfa5bdb703e01cb19f51094f8d8dc964814a8a422f7942e2ac8341"} Apr 21 03:56:59.103563 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:56:59.102582 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-15.ec2.internal" podStartSLOduration=3.102564853 podStartE2EDuration="3.102564853s" podCreationTimestamp="2026-04-21 03:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:56:58.067309775 +0000 UTC m=+3.731899862" watchObservedRunningTime="2026-04-21 03:56:59.102564853 +0000 UTC m=+4.767154940" Apr 21 03:57:00.023334 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.023298 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:00.023508 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.023446 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:00.023860 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.023841 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:00.023956 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.023932 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:00.096728 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.096115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal" event={"ID":"71f54f10b0458b533d193f0621e7e37f","Type":"ContainerStarted","Data":"7573cf06828d91649da60705c5640f9f9fd3f79c3a7d014e58f5911fbc24029a"} Apr 21 03:57:00.132770 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.129245 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-15.ec2.internal" podStartSLOduration=4.129224628 podStartE2EDuration="4.129224628s" podCreationTimestamp="2026-04-21 03:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:00.127876619 +0000 UTC m=+5.792466705" watchObservedRunningTime="2026-04-21 03:57:00.129224628 +0000 UTC m=+5.793814921" Apr 21 03:57:00.137747 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.137608 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lc86l"] Apr 21 03:57:00.144222 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.144195 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.144416 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.144395 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:00.189249 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.189164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-dbus\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.189249 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.189219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-kubelet-config\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.189430 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.189287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.290440 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.290354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-dbus\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.290440 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.290416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-kubelet-config\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.290643 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.290444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.290643 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.290601 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:00.290739 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.290657 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret podName:9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:00.790639576 +0000 UTC m=+6.455229648 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret") pod "global-pull-secret-syncer-lc86l" (UID: "9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:00.291173 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.291014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-dbus\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.291173 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.291089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-kubelet-config\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.492367 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.492335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:00.492367 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.492378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:00.492687 ip-10-0-134-15 
kubenswrapper[2575]: E0421 03:57:00.492507 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:00.492687 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.492561 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs podName:a42d1f99-c5af-45d0-9ce8-8affe0d01ea4 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:04.492543619 +0000 UTC m=+10.157133696 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs") pod "network-metrics-daemon-gpfrt" (UID: "a42d1f99-c5af-45d0-9ce8-8affe0d01ea4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:00.492687 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.492644 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:00.492687 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.492660 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:00.492687 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.492672 2575 projected.go:194] Error preparing data for projected volume kube-api-access-n9prw for pod openshift-network-diagnostics/network-check-target-9cslm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:00.492987 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.492709 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw podName:541c9c2c-7803-469b-9f76-fa3ec6995458 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:04.492696397 +0000 UTC m=+10.157286461 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-n9prw" (UniqueName: "kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw") pod "network-check-target-9cslm" (UID: "541c9c2c-7803-469b-9f76-fa3ec6995458") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:00.794354 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:00.794316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:00.794505 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.794460 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:00.794571 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:00.794536 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret podName:9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:01.794515699 +0000 UTC m=+7.459105780 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret") pod "global-pull-secret-syncer-lc86l" (UID: "9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:01.800949 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:01.800839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:01.800949 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:01.800943 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:01.801449 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:01.801003 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret podName:9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:03.80098889 +0000 UTC m=+9.465578953 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret") pod "global-pull-secret-syncer-lc86l" (UID: "9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:02.023964 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:02.023207 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:02.023964 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:02.023339 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:02.023964 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:02.023708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:02.023964 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:02.023831 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:02.023964 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:02.023871 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:02.023964 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:02.023931 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:03.818293 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:03.818254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:03.818750 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:03.818394 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:03.818750 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:03.818475 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret podName:9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:07.818454816 +0000 UTC m=+13.483044886 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret") pod "global-pull-secret-syncer-lc86l" (UID: "9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:04.023562 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:04.023531 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:04.023735 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:04.023570 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:04.023735 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:04.023547 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:04.023735 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.023659 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:04.023918 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.023794 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:04.023918 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.023869 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:04.523363 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:04.523111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:04.523536 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:04.523390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:04.523536 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.523297 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:04.523536 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.523469 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:04.523536 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.523479 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:04.523536 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.523485 2575 projected.go:194] Error preparing data for projected volume kube-api-access-n9prw for pod openshift-network-diagnostics/network-check-target-9cslm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:04.523713 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.523585 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs podName:a42d1f99-c5af-45d0-9ce8-8affe0d01ea4 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:12.523516978 +0000 UTC m=+18.188107043 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs") pod "network-metrics-daemon-gpfrt" (UID: "a42d1f99-c5af-45d0-9ce8-8affe0d01ea4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:04.523713 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:04.523612 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw podName:541c9c2c-7803-469b-9f76-fa3ec6995458 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:12.523595568 +0000 UTC m=+18.188185636 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-n9prw" (UniqueName: "kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw") pod "network-check-target-9cslm" (UID: "541c9c2c-7803-469b-9f76-fa3ec6995458") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:06.024448 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:06.023973 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:06.024448 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:06.024102 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:06.024448 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:06.023980 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:06.024448 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:06.024233 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:06.024448 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:06.024289 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:06.024448 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:06.024394 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:07.849577 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:07.849543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:07.850014 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:07.849700 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:07.850014 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:07.849779 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret podName:9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:15.849747963 +0000 UTC m=+21.514338033 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret") pod "global-pull-secret-syncer-lc86l" (UID: "9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:08.023705 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:08.023676 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:08.023897 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:08.023676 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:08.023897 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:08.023807 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:08.023897 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:08.023677 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:08.023897 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:08.023860 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:08.024105 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:08.023945 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:10.023577 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:10.023547 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:10.023959 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:10.023641 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:10.023959 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:10.023653 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:10.023959 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:10.023666 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:10.023959 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:10.023737 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:10.023959 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:10.023823 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:12.023449 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:12.023413 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:12.023869 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:12.023481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:12.023869 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:12.023514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:12.023869 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.023604 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:12.023869 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.023635 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:12.023869 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.023739 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:12.587002 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:12.586970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:12.587002 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:12.587009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:12.587216 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.587113 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:12.587216 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.587139 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:12.587216 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.587162 2575 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:12.587216 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.587175 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs podName:a42d1f99-c5af-45d0-9ce8-8affe0d01ea4 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:28.587156669 +0000 UTC m=+34.251746735 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs") pod "network-metrics-daemon-gpfrt" (UID: "a42d1f99-c5af-45d0-9ce8-8affe0d01ea4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:12.587216 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.587175 2575 projected.go:194] Error preparing data for projected volume kube-api-access-n9prw for pod openshift-network-diagnostics/network-check-target-9cslm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:12.587216 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:12.587212 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw podName:541c9c2c-7803-469b-9f76-fa3ec6995458 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:28.587205847 +0000 UTC m=+34.251795911 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n9prw" (UniqueName: "kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw") pod "network-check-target-9cslm" (UID: "541c9c2c-7803-469b-9f76-fa3ec6995458") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:14.024075 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:14.024042 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:14.024525 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:14.024042 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:14.024525 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:14.024042 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:14.024525 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:14.024224 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:14.024525 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:14.024142 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:14.024525 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:14.024312 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:15.121713 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.121670 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j8nd9" event={"ID":"89bfd36f-311d-477d-a1ba-cc9a2854b55d","Type":"ContainerStarted","Data":"5332d9064c5016eff42558fe039d89799d9bfdeee897cca0e084b7dd2961089e"} Apr 21 03:57:15.124022 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.123746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tqkpr" event={"ID":"48670ed3-db9b-4299-a9b9-270bf4c32561","Type":"ContainerStarted","Data":"41dacaaf44f32204363b4ecf7686bee9e68ce75fb50656fb6575c978af1c93a7"} Apr 21 03:57:15.125495 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.125406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6w2n8" event={"ID":"332698b5-e816-48f5-806f-295e3ed3f8fb","Type":"ContainerStarted","Data":"1ca3d1c1a96f0baaf1dadb2ea02ddb7726148dcd60bad2292595d8da17595ba5"} Apr 21 03:57:15.127152 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.126949 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" event={"ID":"021801b3-5414-43aa-8164-e9365f642f91","Type":"ContainerStarted","Data":"5c72faa5786a85bc7f528cbfac83f15ec9ea0ea78fa1d1411f35b463e4ae46e6"} Apr 21 03:57:15.186964 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.186910 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6w2n8" podStartSLOduration=3.044888109 podStartE2EDuration="20.186896374s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.677539485 +0000 UTC m=+3.342129568" lastFinishedPulling="2026-04-21 03:57:14.819547768 +0000 UTC m=+20.484137833" observedRunningTime="2026-04-21 03:57:15.186677159 +0000 UTC m=+20.851267244" watchObservedRunningTime="2026-04-21 03:57:15.186896374 +0000 UTC m=+20.851486457" Apr 21 03:57:15.187050 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.186991 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-j8nd9" podStartSLOduration=3.044719393 podStartE2EDuration="20.18698573s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.652750567 +0000 UTC m=+3.317340646" lastFinishedPulling="2026-04-21 03:57:14.79501691 +0000 UTC m=+20.459606983" observedRunningTime="2026-04-21 03:57:15.15102691 +0000 UTC m=+20.815616998" watchObservedRunningTime="2026-04-21 03:57:15.18698573 +0000 UTC m=+20.851575809" Apr 21 03:57:15.220784 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.220287 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ldkbb" podStartSLOduration=3.02987404 podStartE2EDuration="20.2202739s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.655250875 +0000 UTC m=+3.319840941" lastFinishedPulling="2026-04-21 03:57:14.845650728 +0000 UTC m=+20.510240801" observedRunningTime="2026-04-21 03:57:15.205290767 +0000 UTC m=+20.869880853" watchObservedRunningTime="2026-04-21 03:57:15.2202739 +0000 UTC m=+20.884863986" Apr 21 03:57:15.220784 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.220373 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-tqkpr" podStartSLOduration=11.300565923 podStartE2EDuration="20.220366588s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.686107946 +0000 UTC m=+3.350698010" lastFinishedPulling="2026-04-21 03:57:06.605908607 +0000 UTC m=+12.270498675" observedRunningTime="2026-04-21 03:57:15.22021108 +0000 UTC m=+20.884801165" watchObservedRunningTime="2026-04-21 03:57:15.220366588 +0000 UTC m=+20.884956674" Apr 21 03:57:15.916359 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.916059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:15.916467 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:15.916361 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:15.916467 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:15.916437 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret podName:9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:31.916415118 +0000 UTC m=+37.581005209 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret") pod "global-pull-secret-syncer-lc86l" (UID: "9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:15.973480 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:15.973450 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 03:57:16.023978 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.023958 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:16.024052 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.023985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:16.024096 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:16.024060 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:16.024133 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:16.024112 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:16.024164 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.024148 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:16.024230 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:16.024215 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:16.130346 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.130290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" event={"ID":"5b202830-ba31-44af-9eb0-ac33aeef57d1","Type":"ContainerStarted","Data":"13e34b4fae6ff6f38a6cf7d7961167f81cd5788b11fa0f7a1bce4aa113de9f00"} Apr 21 03:57:16.130346 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.130316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" event={"ID":"5b202830-ba31-44af-9eb0-ac33aeef57d1","Type":"ContainerStarted","Data":"f98e34abdbe7f98a1eeb93f1892154fc8ead456e524f5abc9ad35b3b84b9da11"} Apr 21 03:57:16.131635 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.131615 2575 generic.go:358] "Generic (PLEG): container finished" podID="aa5d6fee-c188-4cee-a9dd-cc90927bef31" containerID="912efef37963b6277fec883d783e2bc38ef018c721e7f433f6f8375dc4579a84" exitCode=0 Apr 21 03:57:16.131723 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.131678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phc24" 
event={"ID":"aa5d6fee-c188-4cee-a9dd-cc90927bef31","Type":"ContainerDied","Data":"912efef37963b6277fec883d783e2bc38ef018c721e7f433f6f8375dc4579a84"} Apr 21 03:57:16.133047 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.133011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zgz2x" event={"ID":"494fdf31-01dd-419f-a5be-8ea679099b8e","Type":"ContainerStarted","Data":"ca2995f1cdb576f2552f931ec1a28b7bca9960514d3eacdb550f0096d0910f79"} Apr 21 03:57:16.135584 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.135561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"8da6ae519060e2aeecd4db4ebf74ac5f93d31885038ca5669105ffcedf704ed1"} Apr 21 03:57:16.135691 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.135589 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"14337ecddae1018b5d3c4c17af493b7c996f8c24d6798831167aafe14d9c9563"} Apr 21 03:57:16.135691 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.135599 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"2ad9ff85db297867a02eb4f7c9ffce3add10a8e576770bb1176de5402f672b78"} Apr 21 03:57:16.135691 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.135610 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"fb55ada1afcb6d4e390ed1c7adc09adf045fa83fe1a5ed30492ec591e5545cd0"} Apr 21 03:57:16.135691 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.135624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"97ba31eb79895a6a557708cfa6094ef88b41caa225067d72980a08ab28b08056"} Apr 21 03:57:16.135691 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.135637 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"73420abfe01df41152fddd724c958ef2131619b2cc34a98135ece0c49905c57b"} Apr 21 03:57:16.179461 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.179422 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zgz2x" podStartSLOduration=3.761114481 podStartE2EDuration="21.179411946s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.686271744 +0000 UTC m=+3.350861815" lastFinishedPulling="2026-04-21 03:57:15.104569203 +0000 UTC m=+20.769159280" observedRunningTime="2026-04-21 03:57:16.178347847 +0000 UTC m=+21.842937933" watchObservedRunningTime="2026-04-21 03:57:16.179411946 +0000 UTC m=+21.844002032" Apr 21 03:57:16.928620 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.928515 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T03:57:15.97347614Z","UUID":"ef25cf6a-81c2-45f4-b174-1e6882aa567f","Handler":null,"Name":"","Endpoint":""} Apr 21 03:57:16.930939 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.930915 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 03:57:16.931083 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:16.930946 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 03:57:17.139212 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:17.139179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rzjbf" event={"ID":"14f705da-fc93-4286-8c99-eb46f7420053","Type":"ContainerStarted","Data":"502f299ab9969aa04069c813a0f45fba24a4fd4c5127595fd19179f0c09cec71"} Apr 21 03:57:18.001192 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.001111 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:57:18.023682 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.023649 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:18.023851 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.023654 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:18.023851 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:18.023786 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:18.023851 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.023801 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:18.024020 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:18.023877 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4" Apr 21 03:57:18.024020 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:18.023941 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:18.143534 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.143494 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" event={"ID":"5b202830-ba31-44af-9eb0-ac33aeef57d1","Type":"ContainerStarted","Data":"064869afb904d253f040cd75e3459b6b08c1118ed4dbe00e9deb798fd2913d92"} Apr 21 03:57:18.146691 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.146666 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"c8afbf48d42721c6bc645d68cb3e6c4c26cd7c66d8268639433b7ed02aa8adab"} Apr 21 03:57:18.161504 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.161465 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rzjbf" podStartSLOduration=6.043859132 podStartE2EDuration="23.161453703s" 
podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.677420775 +0000 UTC m=+3.342010844" lastFinishedPulling="2026-04-21 03:57:14.795015351 +0000 UTC m=+20.459605415" observedRunningTime="2026-04-21 03:57:17.15421901 +0000 UTC m=+22.818809097" watchObservedRunningTime="2026-04-21 03:57:18.161453703 +0000 UTC m=+23.826043789" Apr 21 03:57:18.842578 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.842543 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:57:18.843226 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.843204 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:57:18.860188 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:18.860150 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hwcws" podStartSLOduration=4.416050495 podStartE2EDuration="23.860133496s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.685969103 +0000 UTC m=+3.350559173" lastFinishedPulling="2026-04-21 03:57:17.130052089 +0000 UTC m=+22.794642174" observedRunningTime="2026-04-21 03:57:18.162736689 +0000 UTC m=+23.827326775" watchObservedRunningTime="2026-04-21 03:57:18.860133496 +0000 UTC m=+24.524723584" Apr 21 03:57:19.149668 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:19.149596 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6w2n8" Apr 21 03:57:20.023331 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.023174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:20.023426 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.023204 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:20.023552 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.023215 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l" Apr 21 03:57:20.023634 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:20.023406 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458" Apr 21 03:57:20.023684 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:20.023627 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3" Apr 21 03:57:20.023684 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:20.023537 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4"
Apr 21 03:57:20.155253 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.155220 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" event={"ID":"9acdf950-6bdd-4903-943b-90a6f96b5271","Type":"ContainerStarted","Data":"cee8a8ecf64e00c54d79eda56271e295123bd5d29096a8a299562bd589f0e2ef"}
Apr 21 03:57:20.155694 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.155673 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:57:20.155864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.155831 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:57:20.155864 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.155861 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:57:20.174297 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.174260 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:57:20.174933 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.174910 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:57:20.188594 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:20.188541 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd" podStartSLOduration=7.98193038 podStartE2EDuration="25.188523668s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.685894618 +0000 UTC m=+3.350484688" lastFinishedPulling="2026-04-21 03:57:14.892487911 +0000 UTC m=+20.557077976" observedRunningTime="2026-04-21 03:57:20.188310434 +0000 UTC m=+25.852900532" watchObservedRunningTime="2026-04-21 03:57:20.188523668 +0000 UTC m=+25.853113757"
Apr 21 03:57:21.158775 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:21.158720 2575 generic.go:358] "Generic (PLEG): container finished" podID="aa5d6fee-c188-4cee-a9dd-cc90927bef31" containerID="6bc796d5a2650ab560d414500fb1f976f2bf91b05e586cdc978090c796763c32" exitCode=0
Apr 21 03:57:21.159200 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:21.158809 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phc24" event={"ID":"aa5d6fee-c188-4cee-a9dd-cc90927bef31","Type":"ContainerDied","Data":"6bc796d5a2650ab560d414500fb1f976f2bf91b05e586cdc978090c796763c32"}
Apr 21 03:57:21.999653 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:21.999426 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gpfrt"]
Apr 21 03:57:21.999790 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:21.999702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt"
Apr 21 03:57:21.999853 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:21.999828 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4"
Apr 21 03:57:22.002709 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:22.002682 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9cslm"]
Apr 21 03:57:22.002829 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:22.002817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm"
Apr 21 03:57:22.002936 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:22.002915 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458"
Apr 21 03:57:22.003366 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:22.003346 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lc86l"]
Apr 21 03:57:22.003454 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:22.003448 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l"
Apr 21 03:57:22.003557 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:22.003535 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3"
Apr 21 03:57:22.162068 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:22.162041 2575 generic.go:358] "Generic (PLEG): container finished" podID="aa5d6fee-c188-4cee-a9dd-cc90927bef31" containerID="d110be877a8a1c8dcc89affb34463a21b8bf6025fa2a19a53e010f2af5a38930" exitCode=0
Apr 21 03:57:22.162423 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:22.162131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phc24" event={"ID":"aa5d6fee-c188-4cee-a9dd-cc90927bef31","Type":"ContainerDied","Data":"d110be877a8a1c8dcc89affb34463a21b8bf6025fa2a19a53e010f2af5a38930"}
Apr 21 03:57:23.165639 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:23.165550 2575 generic.go:358] "Generic (PLEG): container finished" podID="aa5d6fee-c188-4cee-a9dd-cc90927bef31" containerID="2af23914765389b0a7c541bf1f077ebaa5598756e5a79344c3cfabd9a7b6824d" exitCode=0
Apr 21 03:57:23.165639 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:23.165593 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phc24" event={"ID":"aa5d6fee-c188-4cee-a9dd-cc90927bef31","Type":"ContainerDied","Data":"2af23914765389b0a7c541bf1f077ebaa5598756e5a79344c3cfabd9a7b6824d"}
Apr 21 03:57:24.023689 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:24.023415 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt"
Apr 21 03:57:24.023689 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:24.023569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm"
Apr 21 03:57:24.023689 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:24.023669 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458"
Apr 21 03:57:24.023689 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:24.023570 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4"
Apr 21 03:57:24.023689 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:24.023569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l"
Apr 21 03:57:24.024067 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:24.023794 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3"
Apr 21 03:57:26.023672 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:26.023639 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm"
Apr 21 03:57:26.024427 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:26.023675 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l"
Apr 21 03:57:26.024427 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:26.023681 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt"
Apr 21 03:57:26.024427 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:26.023747 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cslm" podUID="541c9c2c-7803-469b-9f76-fa3ec6995458"
Apr 21 03:57:26.024427 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:26.023886 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpfrt" podUID="a42d1f99-c5af-45d0-9ce8-8affe0d01ea4"
Apr 21 03:57:26.024427 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:26.023990 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lc86l" podUID="9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3"
Apr 21 03:57:27.682431 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.682404 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-15.ec2.internal" event="NodeReady"
Apr 21 03:57:27.682851 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.682553 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 03:57:27.719376 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.719349 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64f7f94fbd-b7gks"]
Apr 21 03:57:27.745417 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.745393 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hz28z"]
Apr 21 03:57:27.745592 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.745556 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.749094 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.748925 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 03:57:27.749094 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.748968 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 03:57:27.749792 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.749569 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 03:57:27.749792 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.749600 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ntv6f\""
Apr 21 03:57:27.755038 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.755016 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 03:57:27.760448 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.760429 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64f7f94fbd-b7gks"]
Apr 21 03:57:27.760557 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.760457 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hz28z"]
Apr 21 03:57:27.760557 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.760541 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:27.763150 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.763132 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 03:57:27.763244 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.763148 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 03:57:27.763244 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.763167 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lrpqk\""
Apr 21 03:57:27.830114 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.830050 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6kgc9"]
Apr 21 03:57:27.846068 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.846006 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6kgc9"]
Apr 21 03:57:27.846068 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.846038 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:27.849670 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.849306 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 03:57:27.849670 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.849506 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4vtdx\""
Apr 21 03:57:27.849670 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.849564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 03:57:27.851116 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.851087 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 03:57:27.908116 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7fc9465-0576-4a91-ba4c-913400d12eb3-tmp-dir\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:27.908268 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srt7v\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-kube-api-access-srt7v\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.908268 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-bound-sa-token\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.908268 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-installation-pull-secrets\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.908268 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-image-registry-private-configuration\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.908472 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.908472 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxlk\" (UniqueName: \"kubernetes.io/projected/a7fc9465-0576-4a91-ba4c-913400d12eb3-kube-api-access-9bxlk\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:27.908472 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-trusted-ca\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.908472 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-ca-trust-extracted\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.908472 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-certificates\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:27.908472 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:27.908736 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:27.908503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7fc9465-0576-4a91-ba4c-913400d12eb3-config-volume\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.009519 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-ca-trust-extracted\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.009674 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009530 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-certificates\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.009674 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.009674 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009582 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:28.009674 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7fc9465-0576-4a91-ba4c-913400d12eb3-config-volume\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.009674 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009661 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnskr\" (UniqueName: \"kubernetes.io/projected/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-kube-api-access-bnskr\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7fc9465-0576-4a91-ba4c-913400d12eb3-tmp-dir\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srt7v\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-kube-api-access-srt7v\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.009735 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.009825 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls podName:a7fc9465-0576-4a91-ba4c-913400d12eb3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:28.509803061 +0000 UTC m=+34.174393144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls") pod "dns-default-hz28z" (UID: "a7fc9465-0576-4a91-ba4c-913400d12eb3") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-bound-sa-token\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-installation-pull-secrets\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-image-registry-private-configuration\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxlk\" (UniqueName: \"kubernetes.io/projected/a7fc9465-0576-4a91-ba4c-913400d12eb3-kube-api-access-9bxlk\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.009993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-ca-trust-extracted\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.010010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-trusted-ca\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.010095 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.010107 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f7f94fbd-b7gks: secret "image-registry-tls" not found
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.010153 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls podName:a3b6160a-423d-45d3-b52f-9933a1bfdbc6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:28.510138729 +0000 UTC m=+34.174728798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls") pod "image-registry-64f7f94fbd-b7gks" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6") : secret "image-registry-tls" not found
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.010152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7fc9465-0576-4a91-ba4c-913400d12eb3-tmp-dir\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.010323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-certificates\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.010572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.010406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7fc9465-0576-4a91-ba4c-913400d12eb3-config-volume\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.011311 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.011044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-trusted-ca\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.014881 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.014849 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-image-registry-private-configuration\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.014881 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.014870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-installation-pull-secrets\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.019098 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.019074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srt7v\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-kube-api-access-srt7v\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.019547 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.019525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxlk\" (UniqueName: \"kubernetes.io/projected/a7fc9465-0576-4a91-ba4c-913400d12eb3-kube-api-access-9bxlk\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.019634 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.019619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-bound-sa-token\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.023563 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.023542 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l"
Apr 21 03:57:28.023664 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.023585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm"
Apr 21 03:57:28.023664 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.023594 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt"
Apr 21 03:57:28.026533 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.026455 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 03:57:28.026533 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.026481 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rcf6j\""
Apr 21 03:57:28.026708 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.026578 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tn8dn\""
Apr 21 03:57:28.026708 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.026703 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 03:57:28.026860 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.026805 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 03:57:28.026958 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.026935 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 03:57:28.111225 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.111139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnskr\" (UniqueName: \"kubernetes.io/projected/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-kube-api-access-bnskr\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:28.111337 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.111238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:28.111399 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.111333 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:28.111399 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.111394 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert podName:8b7373a9-03be-47d1-9a03-d92ce2e99f2a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:28.611374356 +0000 UTC m=+34.275964428 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert") pod "ingress-canary-6kgc9" (UID: "8b7373a9-03be-47d1-9a03-d92ce2e99f2a") : secret "canary-serving-cert" not found
Apr 21 03:57:28.120459 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.120428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnskr\" (UniqueName: \"kubernetes.io/projected/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-kube-api-access-bnskr\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:28.499227 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.499078 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd"]
Apr 21 03:57:28.515125 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.515101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:28.515282 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.515158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:28.515282 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.515259 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:28.515282 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.515281 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f7f94fbd-b7gks: secret "image-registry-tls" not found
Apr 21 03:57:28.515433 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.515326 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:28.515433 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.515360 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls podName:a3b6160a-423d-45d3-b52f-9933a1bfdbc6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:29.515340123 +0000 UTC m=+35.179930210 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls") pod "image-registry-64f7f94fbd-b7gks" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6") : secret "image-registry-tls" not found
Apr 21 03:57:28.515433 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.515382 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls podName:a7fc9465-0576-4a91-ba4c-913400d12eb3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:29.515366823 +0000 UTC m=+35.179956901 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls") pod "dns-default-hz28z" (UID: "a7fc9465-0576-4a91-ba4c-913400d12eb3") : secret "dns-default-metrics-tls" not found Apr 21 03:57:28.528788 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.528768 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95"] Apr 21 03:57:28.528919 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.528899 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" Apr 21 03:57:28.531828 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.531620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 03:57:28.531828 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.531722 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 03:57:28.532001 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.531846 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 03:57:28.533252 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.533233 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 03:57:28.533364 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.533285 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-fnddc\"" Apr 21 03:57:28.542231 ip-10-0-134-15 kubenswrapper[2575]: I0421 
03:57:28.542213 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd"] Apr 21 03:57:28.542330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.542237 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95"] Apr 21 03:57:28.542330 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.542253 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz"] Apr 21 03:57:28.542444 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.542356 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.545289 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.545271 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 03:57:28.545388 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.545289 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 03:57:28.545388 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.545289 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 03:57:28.545388 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.545274 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 03:57:28.564017 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.563998 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz"] Apr 21 03:57:28.564108 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.564080 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.566690 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.566673 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 03:57:28.616136 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.616114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9" Apr 21 03:57:28.616243 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.616147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/762e77d2-f995-4355-832f-606b0cf4f9e6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67d4f68f6f-54jxd\" (UID: \"762e77d2-f995-4355-832f-606b0cf4f9e6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" Apr 21 03:57:28.616243 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.616184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7chf2\" (UniqueName: \"kubernetes.io/projected/762e77d2-f995-4355-832f-606b0cf4f9e6-kube-api-access-7chf2\") pod \"managed-serviceaccount-addon-agent-67d4f68f6f-54jxd\" (UID: \"762e77d2-f995-4355-832f-606b0cf4f9e6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" Apr 21 03:57:28.616243 ip-10-0-134-15 
kubenswrapper[2575]: I0421 03:57:28.616228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:28.616385 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.616257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:57:28.616385 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.616324 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:28.616385 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.616350 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 03:57:28.616520 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.616407 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs podName:a42d1f99-c5af-45d0-9ce8-8affe0d01ea4 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.616391052 +0000 UTC m=+66.280981115 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs") pod "network-metrics-daemon-gpfrt" (UID: "a42d1f99-c5af-45d0-9ce8-8affe0d01ea4") : secret "metrics-daemon-secret" not found Apr 21 03:57:28.616520 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:28.616420 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert podName:8b7373a9-03be-47d1-9a03-d92ce2e99f2a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:29.616414149 +0000 UTC m=+35.281004213 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert") pod "ingress-canary-6kgc9" (UID: "8b7373a9-03be-47d1-9a03-d92ce2e99f2a") : secret "canary-serving-cert" not found Apr 21 03:57:28.618673 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.618654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9prw\" (UniqueName: \"kubernetes.io/projected/541c9c2c-7803-469b-9f76-fa3ec6995458-kube-api-access-n9prw\") pod \"network-check-target-9cslm\" (UID: \"541c9c2c-7803-469b-9f76-fa3ec6995458\") " pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:28.649635 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.649613 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:57:28.716740 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.716710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrq9z\" (UniqueName: \"kubernetes.io/projected/888bac5f-e281-4267-ae08-fce73b1f965e-kube-api-access-lrq9z\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.716778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-tmp\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.716808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/888bac5f-e281-4267-ae08-fce73b1f965e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.716839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-hub\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.717205 
ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.716895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-762hv\" (UniqueName: \"kubernetes.io/projected/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-kube-api-access-762hv\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.716920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-ca\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.717051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/762e77d2-f995-4355-832f-606b0cf4f9e6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67d4f68f6f-54jxd\" (UID: \"762e77d2-f995-4355-832f-606b0cf4f9e6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.717091 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-klusterlet-config\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.717136 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7chf2\" (UniqueName: \"kubernetes.io/projected/762e77d2-f995-4355-832f-606b0cf4f9e6-kube-api-access-7chf2\") pod \"managed-serviceaccount-addon-agent-67d4f68f6f-54jxd\" (UID: \"762e77d2-f995-4355-832f-606b0cf4f9e6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.717163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.717205 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.717194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.719774 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.719738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/762e77d2-f995-4355-832f-606b0cf4f9e6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67d4f68f6f-54jxd\" (UID: \"762e77d2-f995-4355-832f-606b0cf4f9e6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" Apr 21 03:57:28.725507 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.725488 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7chf2\" (UniqueName: \"kubernetes.io/projected/762e77d2-f995-4355-832f-606b0cf4f9e6-kube-api-access-7chf2\") pod \"managed-serviceaccount-addon-agent-67d4f68f6f-54jxd\" (UID: \"762e77d2-f995-4355-832f-606b0cf4f9e6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" Apr 21 03:57:28.818197 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrq9z\" (UniqueName: \"kubernetes.io/projected/888bac5f-e281-4267-ae08-fce73b1f965e-kube-api-access-lrq9z\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.818299 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-tmp\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.818299 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818231 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/888bac5f-e281-4267-ae08-fce73b1f965e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.818299 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-hub\") pod 
\"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.818299 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-762hv\" (UniqueName: \"kubernetes.io/projected/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-kube-api-access-762hv\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.818485 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-ca\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.818485 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-klusterlet-config\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.818485 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.818636 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.818636 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.818623 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-tmp\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.820833 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.820809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.821160 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.821139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-ca\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.821279 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.821264 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-klusterlet-config\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.821448 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.821430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-hub\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.821488 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.821435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/888bac5f-e281-4267-ae08-fce73b1f965e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.824114 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.824101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/888bac5f-e281-4267-ae08-fce73b1f965e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" (UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.831851 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.831832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrq9z\" (UniqueName: \"kubernetes.io/projected/888bac5f-e281-4267-ae08-fce73b1f965e-kube-api-access-lrq9z\") pod \"cluster-proxy-proxy-agent-699587cf6f-ffp95\" 
(UID: \"888bac5f-e281-4267-ae08-fce73b1f965e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.831937 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.831862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-762hv\" (UniqueName: \"kubernetes.io/projected/4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5-kube-api-access-762hv\") pod \"klusterlet-addon-workmgr-7485454c8-7drvz\" (UID: \"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.838412 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.838379 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9cslm"] Apr 21 03:57:28.849854 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.849830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" Apr 21 03:57:28.858463 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.858440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:57:28.891031 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:28.891013 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" Apr 21 03:57:28.935077 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:57:28.935053 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod541c9c2c_7803_469b_9f76_fa3ec6995458.slice/crio-3b4082194aea9f284a808f49b5d30de10f4a49ffa323b1fe3f1adcf403ecb619 WatchSource:0}: Error finding container 3b4082194aea9f284a808f49b5d30de10f4a49ffa323b1fe3f1adcf403ecb619: Status 404 returned error can't find the container with id 3b4082194aea9f284a808f49b5d30de10f4a49ffa323b1fe3f1adcf403ecb619 Apr 21 03:57:29.096577 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.096541 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz"] Apr 21 03:57:29.101095 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:57:29.101062 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b5ab6a5_9e4f_4ad4_bbfb_9156608256e5.slice/crio-b0ed8bf066df5e31dee7e5bc2a8db910944295d70d3a357a5124c0e1ee26f1cc WatchSource:0}: Error finding container b0ed8bf066df5e31dee7e5bc2a8db910944295d70d3a357a5124c0e1ee26f1cc: Status 404 returned error can't find the container with id b0ed8bf066df5e31dee7e5bc2a8db910944295d70d3a357a5124c0e1ee26f1cc Apr 21 03:57:29.111596 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.111572 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd"] Apr 21 03:57:29.115743 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:57:29.115720 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod762e77d2_f995_4355_832f_606b0cf4f9e6.slice/crio-90b6357d00940d880ff6cdb69ee61fbffd5393ed21a2a04f1a2092bacaf4d583 WatchSource:0}: Error finding container 90b6357d00940d880ff6cdb69ee61fbffd5393ed21a2a04f1a2092bacaf4d583: Status 404 returned error can't find the container with id 90b6357d00940d880ff6cdb69ee61fbffd5393ed21a2a04f1a2092bacaf4d583
Apr 21 03:57:29.119495 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.119259 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95"]
Apr 21 03:57:29.123177 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:57:29.123148 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod888bac5f_e281_4267_ae08_fce73b1f965e.slice/crio-c238f62d92e9d583ccb3957028b5cf828d7a08ab52375e90e544d7167807855e WatchSource:0}: Error finding container c238f62d92e9d583ccb3957028b5cf828d7a08ab52375e90e544d7167807855e: Status 404 returned error can't find the container with id c238f62d92e9d583ccb3957028b5cf828d7a08ab52375e90e544d7167807855e
Apr 21 03:57:29.177449 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.177421 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" event={"ID":"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5","Type":"ContainerStarted","Data":"b0ed8bf066df5e31dee7e5bc2a8db910944295d70d3a357a5124c0e1ee26f1cc"}
Apr 21 03:57:29.178356 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.178335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" event={"ID":"888bac5f-e281-4267-ae08-fce73b1f965e","Type":"ContainerStarted","Data":"c238f62d92e9d583ccb3957028b5cf828d7a08ab52375e90e544d7167807855e"}
Apr 21 03:57:29.179419 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.179383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9cslm" event={"ID":"541c9c2c-7803-469b-9f76-fa3ec6995458","Type":"ContainerStarted","Data":"3b4082194aea9f284a808f49b5d30de10f4a49ffa323b1fe3f1adcf403ecb619"}
Apr 21 03:57:29.180377 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.180356 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" event={"ID":"762e77d2-f995-4355-832f-606b0cf4f9e6","Type":"ContainerStarted","Data":"90b6357d00940d880ff6cdb69ee61fbffd5393ed21a2a04f1a2092bacaf4d583"}
Apr 21 03:57:29.529268 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.529047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:29.529474 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.529302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:29.529474 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:29.529221 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:29.529474 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:29.529351 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f7f94fbd-b7gks: secret "image-registry-tls" not found
Apr 21 03:57:29.529474 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:29.529419 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls podName:a3b6160a-423d-45d3-b52f-9933a1bfdbc6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:31.529398072 +0000 UTC m=+37.193988136 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls") pod "image-registry-64f7f94fbd-b7gks" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6") : secret "image-registry-tls" not found
Apr 21 03:57:29.529474 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:29.529437 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:29.529749 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:29.529490 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls podName:a7fc9465-0576-4a91-ba4c-913400d12eb3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:31.529472856 +0000 UTC m=+37.194062937 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls") pod "dns-default-hz28z" (UID: "a7fc9465-0576-4a91-ba4c-913400d12eb3") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:29.630796 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:29.630696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:29.630978 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:29.630871 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:29.630978 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:29.630941 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert podName:8b7373a9-03be-47d1-9a03-d92ce2e99f2a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:31.630921564 +0000 UTC m=+37.295511631 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert") pod "ingress-canary-6kgc9" (UID: "8b7373a9-03be-47d1-9a03-d92ce2e99f2a") : secret "canary-serving-cert" not found
Apr 21 03:57:30.188819 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:30.188650 2575 generic.go:358] "Generic (PLEG): container finished" podID="aa5d6fee-c188-4cee-a9dd-cc90927bef31" containerID="4993be1495f45e7bd5353331b9788e66d328c4f5c46844fd8e1aacbf363cc87e" exitCode=0
Apr 21 03:57:30.188819 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:30.188718 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phc24" event={"ID":"aa5d6fee-c188-4cee-a9dd-cc90927bef31","Type":"ContainerDied","Data":"4993be1495f45e7bd5353331b9788e66d328c4f5c46844fd8e1aacbf363cc87e"}
Apr 21 03:57:31.197001 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:31.196107 2575 generic.go:358] "Generic (PLEG): container finished" podID="aa5d6fee-c188-4cee-a9dd-cc90927bef31" containerID="8866ec696cbfb31d309a2eeb7d9f1c5f3fbc4040325806d909c67d1f39425ff9" exitCode=0
Apr 21 03:57:31.197001 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:31.196189 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phc24" event={"ID":"aa5d6fee-c188-4cee-a9dd-cc90927bef31","Type":"ContainerDied","Data":"8866ec696cbfb31d309a2eeb7d9f1c5f3fbc4040325806d909c67d1f39425ff9"}
Apr 21 03:57:31.550405 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:31.550370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:31.550560 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:31.550441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:31.550652 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:31.550632 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:31.550708 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:31.550702 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls podName:a7fc9465-0576-4a91-ba4c-913400d12eb3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:35.550682054 +0000 UTC m=+41.215272125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls") pod "dns-default-hz28z" (UID: "a7fc9465-0576-4a91-ba4c-913400d12eb3") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:31.551158 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:31.551139 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:31.551253 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:31.551160 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f7f94fbd-b7gks: secret "image-registry-tls" not found
Apr 21 03:57:31.551253 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:31.551209 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls podName:a3b6160a-423d-45d3-b52f-9933a1bfdbc6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:35.551192603 +0000 UTC m=+41.215782683 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls") pod "image-registry-64f7f94fbd-b7gks" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6") : secret "image-registry-tls" not found
Apr 21 03:57:31.651895 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:31.651862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:31.652041 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:31.652018 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:31.652109 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:31.652087 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert podName:8b7373a9-03be-47d1-9a03-d92ce2e99f2a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:35.652072593 +0000 UTC m=+41.316662657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert") pod "ingress-canary-6kgc9" (UID: "8b7373a9-03be-47d1-9a03-d92ce2e99f2a") : secret "canary-serving-cert" not found
Apr 21 03:57:31.956501 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:31.956057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l"
Apr 21 03:57:31.966569 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:31.966513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3-original-pull-secret\") pod \"global-pull-secret-syncer-lc86l\" (UID: \"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3\") " pod="kube-system/global-pull-secret-syncer-lc86l"
Apr 21 03:57:32.243413 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:32.243340 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lc86l"
Apr 21 03:57:35.579939 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:35.579904 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:35.580476 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:35.579969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:35.580476 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:35.580092 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:35.580476 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:35.580142 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:35.580476 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:35.580160 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls podName:a7fc9465-0576-4a91-ba4c-913400d12eb3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:43.580144514 +0000 UTC m=+49.244734578 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls") pod "dns-default-hz28z" (UID: "a7fc9465-0576-4a91-ba4c-913400d12eb3") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:35.580476 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:35.580162 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f7f94fbd-b7gks: secret "image-registry-tls" not found
Apr 21 03:57:35.580476 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:35.580216 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls podName:a3b6160a-423d-45d3-b52f-9933a1bfdbc6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:43.580199681 +0000 UTC m=+49.244789746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls") pod "image-registry-64f7f94fbd-b7gks" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6") : secret "image-registry-tls" not found
Apr 21 03:57:35.680739 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:35.680696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:35.680940 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:35.680878 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:35.681012 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:35.680945 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert podName:8b7373a9-03be-47d1-9a03-d92ce2e99f2a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:43.680925447 +0000 UTC m=+49.345515526 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert") pod "ingress-canary-6kgc9" (UID: "8b7373a9-03be-47d1-9a03-d92ce2e99f2a") : secret "canary-serving-cert" not found
Apr 21 03:57:36.576209 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:36.576114 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lc86l"]
Apr 21 03:57:36.799078 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:57:36.799050 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b00db34_9aa2_4d7a_82d3_34c2ded3bcd3.slice/crio-acb86962822fd07fe1f6f5fca0f812cf7c78d6d94182ce7367eba6490ca5ae3a WatchSource:0}: Error finding container acb86962822fd07fe1f6f5fca0f812cf7c78d6d94182ce7367eba6490ca5ae3a: Status 404 returned error can't find the container with id acb86962822fd07fe1f6f5fca0f812cf7c78d6d94182ce7367eba6490ca5ae3a
Apr 21 03:57:37.210261 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.210227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9cslm" event={"ID":"541c9c2c-7803-469b-9f76-fa3ec6995458","Type":"ContainerStarted","Data":"726acd15ff20b7e87be799fb19ab8bbcd2d8a76733e5ed32e2d88c7757b431ab"}
Apr 21 03:57:37.210518 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.210315 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9cslm"
Apr 21 03:57:37.213201 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.213180 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-phc24" event={"ID":"aa5d6fee-c188-4cee-a9dd-cc90927bef31","Type":"ContainerStarted","Data":"6046f06fe735fa22e3cf9ad60a0ad21184ba96073aec1077868c57e6c9ff16f7"}
Apr 21 03:57:37.214473 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.214454 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" event={"ID":"762e77d2-f995-4355-832f-606b0cf4f9e6","Type":"ContainerStarted","Data":"ef45f3f449443f02e5137e88623fb004c9314f502d321ae88f6b45637f5e811d"}
Apr 21 03:57:37.215443 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.215424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lc86l" event={"ID":"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3","Type":"ContainerStarted","Data":"acb86962822fd07fe1f6f5fca0f812cf7c78d6d94182ce7367eba6490ca5ae3a"}
Apr 21 03:57:37.216535 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.216515 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" event={"ID":"4b5ab6a5-9e4f-4ad4-bbfb-9156608256e5","Type":"ContainerStarted","Data":"ca7e8a31707e40f18e9c6c1ec8b1680fefdfffb3748580c537ffb09246baa5da"}
Apr 21 03:57:37.216715 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.216689 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz"
Apr 21 03:57:37.217818 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.217787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" event={"ID":"888bac5f-e281-4267-ae08-fce73b1f965e","Type":"ContainerStarted","Data":"d32a1dc9782c382bfceb349fe9e90708c65a13132f9b4f3e32eefec3c5d475d3"}
Apr 21 03:57:37.218313 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.218300 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz"
Apr 21 03:57:37.226628 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.226592 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9cslm" podStartSLOduration=34.711650839 podStartE2EDuration="42.226577395s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:57:28.942053147 +0000 UTC m=+34.606643225" lastFinishedPulling="2026-04-21 03:57:36.456979718 +0000 UTC m=+42.121569781" observedRunningTime="2026-04-21 03:57:37.226455759 +0000 UTC m=+42.891045844" watchObservedRunningTime="2026-04-21 03:57:37.226577395 +0000 UTC m=+42.891167482"
Apr 21 03:57:37.248700 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.248656 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-phc24" podStartSLOduration=10.957258664 podStartE2EDuration="42.248647057s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:56:57.677437924 +0000 UTC m=+3.342027988" lastFinishedPulling="2026-04-21 03:57:28.968826304 +0000 UTC m=+34.633416381" observedRunningTime="2026-04-21 03:57:37.247999617 +0000 UTC m=+42.912589704" watchObservedRunningTime="2026-04-21 03:57:37.248647057 +0000 UTC m=+42.913237143"
Apr 21 03:57:37.263564 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.263530 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7485454c8-7drvz" podStartSLOduration=1.535306416 podStartE2EDuration="9.263519921s" podCreationTimestamp="2026-04-21 03:57:28 +0000 UTC" firstStartedPulling="2026-04-21 03:57:29.106617852 +0000 UTC m=+34.771207919" lastFinishedPulling="2026-04-21 03:57:36.834831356 +0000 UTC m=+42.499421424" observedRunningTime="2026-04-21 03:57:37.263032713 +0000 UTC m=+42.927622800" watchObservedRunningTime="2026-04-21 03:57:37.263519921 +0000 UTC m=+42.928110006"
Apr 21 03:57:37.280533 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:37.280500 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67d4f68f6f-54jxd" podStartSLOduration=1.954396906 podStartE2EDuration="9.280492226s" podCreationTimestamp="2026-04-21 03:57:28 +0000 UTC" firstStartedPulling="2026-04-21 03:57:29.119493575 +0000 UTC m=+34.784083646" lastFinishedPulling="2026-04-21 03:57:36.445588899 +0000 UTC m=+42.110178966" observedRunningTime="2026-04-21 03:57:37.279688738 +0000 UTC m=+42.944278823" watchObservedRunningTime="2026-04-21 03:57:37.280492226 +0000 UTC m=+42.945082303"
Apr 21 03:57:40.226582 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:40.226542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" event={"ID":"888bac5f-e281-4267-ae08-fce73b1f965e","Type":"ContainerStarted","Data":"a99f795a44659d090b84b5035a811187ee56175b43d7bc60824826e346b14dea"}
Apr 21 03:57:40.226582 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:40.226580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" event={"ID":"888bac5f-e281-4267-ae08-fce73b1f965e","Type":"ContainerStarted","Data":"42712524dfebd40f64726947eaa363c7c48f372b756ceb4f5b238a08aa7c7025"}
Apr 21 03:57:40.244457 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:40.244407 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" podStartSLOduration=1.952161055 podStartE2EDuration="12.244392262s" podCreationTimestamp="2026-04-21 03:57:28 +0000 UTC" firstStartedPulling="2026-04-21 03:57:29.125510394 +0000 UTC m=+34.790100463" lastFinishedPulling="2026-04-21 03:57:39.417741603 +0000 UTC m=+45.082331670" observedRunningTime="2026-04-21 03:57:40.244073778 +0000 UTC m=+45.908663864" watchObservedRunningTime="2026-04-21 03:57:40.244392262 +0000 UTC m=+45.908982350"
Apr 21 03:57:43.236559 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:43.236515 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lc86l" event={"ID":"9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3","Type":"ContainerStarted","Data":"109c192b0823f887148ed7847becdd37518b60b059931b5f5438f70dbce070e9"}
Apr 21 03:57:43.252412 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:43.252366 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lc86l" podStartSLOduration=37.537787472 podStartE2EDuration="43.252352123s" podCreationTimestamp="2026-04-21 03:57:00 +0000 UTC" firstStartedPulling="2026-04-21 03:57:36.818497938 +0000 UTC m=+42.483088002" lastFinishedPulling="2026-04-21 03:57:42.533062574 +0000 UTC m=+48.197652653" observedRunningTime="2026-04-21 03:57:43.251509674 +0000 UTC m=+48.916099762" watchObservedRunningTime="2026-04-21 03:57:43.252352123 +0000 UTC m=+48.916942208"
Apr 21 03:57:43.642372 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:43.642344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:43.642518 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:43.642412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:43.642518 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:43.642495 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:43.642518 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:43.642495 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:43.642619 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:43.642564 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls podName:a7fc9465-0576-4a91-ba4c-913400d12eb3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:59.642549088 +0000 UTC m=+65.307139152 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls") pod "dns-default-hz28z" (UID: "a7fc9465-0576-4a91-ba4c-913400d12eb3") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:43.642619 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:43.642506 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f7f94fbd-b7gks: secret "image-registry-tls" not found
Apr 21 03:57:43.642700 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:43.642621 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls podName:a3b6160a-423d-45d3-b52f-9933a1bfdbc6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:59.642610203 +0000 UTC m=+65.307200267 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls") pod "image-registry-64f7f94fbd-b7gks" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6") : secret "image-registry-tls" not found
Apr 21 03:57:43.743014 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:43.742986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:43.743133 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:43.743093 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:43.743170 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:43.743143 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert podName:8b7373a9-03be-47d1-9a03-d92ce2e99f2a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:59.743126572 +0000 UTC m=+65.407716637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert") pod "ingress-canary-6kgc9" (UID: "8b7373a9-03be-47d1-9a03-d92ce2e99f2a") : secret "canary-serving-cert" not found
Apr 21 03:57:52.181371 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:52.181340 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxcbd"
Apr 21 03:57:56.346471 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.346381 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx"]
Apr 21 03:57:56.351153 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.351130 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx"
Apr 21 03:57:56.354347 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.354328 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-wmqkj\""
Apr 21 03:57:56.362793 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.362750 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx"]
Apr 21 03:57:56.433958 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.433933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npm8x\" (UniqueName: \"kubernetes.io/projected/64ecb179-9e59-428a-8a4d-a5bfdc94ac99-kube-api-access-npm8x\") pod \"network-check-source-8894fc9bd-tvjbx\" (UID: \"64ecb179-9e59-428a-8a4d-a5bfdc94ac99\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx"
Apr 21 03:57:56.534550 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.534521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npm8x\" (UniqueName: \"kubernetes.io/projected/64ecb179-9e59-428a-8a4d-a5bfdc94ac99-kube-api-access-npm8x\") pod \"network-check-source-8894fc9bd-tvjbx\" (UID: \"64ecb179-9e59-428a-8a4d-a5bfdc94ac99\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx"
Apr 21 03:57:56.549019 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.549002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npm8x\" (UniqueName: \"kubernetes.io/projected/64ecb179-9e59-428a-8a4d-a5bfdc94ac99-kube-api-access-npm8x\") pod \"network-check-source-8894fc9bd-tvjbx\" (UID: \"64ecb179-9e59-428a-8a4d-a5bfdc94ac99\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx"
Apr 21 03:57:56.659494 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.659473 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx"
Apr 21 03:57:56.771264 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:56.771234 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx"]
Apr 21 03:57:56.774270 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:57:56.774242 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64ecb179_9e59_428a_8a4d_a5bfdc94ac99.slice/crio-7dc2f9f5e71470c124fa626509bd08614a9910e958e14316dd69760db3ea8183 WatchSource:0}: Error finding container 7dc2f9f5e71470c124fa626509bd08614a9910e958e14316dd69760db3ea8183: Status 404 returned error can't find the container with id 7dc2f9f5e71470c124fa626509bd08614a9910e958e14316dd69760db3ea8183
Apr 21 03:57:57.272394 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:57.272361 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx" event={"ID":"64ecb179-9e59-428a-8a4d-a5bfdc94ac99","Type":"ContainerStarted","Data":"5e2500b8e1cda2508615162f4b21ec6689ce120365bccf129175027da9f514a2"}
Apr 21 03:57:57.272394 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:57.272395 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx" event={"ID":"64ecb179-9e59-428a-8a4d-a5bfdc94ac99","Type":"ContainerStarted","Data":"7dc2f9f5e71470c124fa626509bd08614a9910e958e14316dd69760db3ea8183"}
Apr 21 03:57:57.290484 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:57.290445 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tvjbx" podStartSLOduration=1.29043335 podStartE2EDuration="1.29043335s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:57.289538123 +0000 UTC m=+62.954128210" watchObservedRunningTime="2026-04-21 03:57:57.29043335 +0000 UTC m=+62.955023436"
Apr 21 03:57:59.656388 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:59.656354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks"
Apr 21 03:57:59.656848 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:59.656397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z"
Apr 21 03:57:59.656848 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:59.656498 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:59.656848 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:59.656553 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls podName:a7fc9465-0576-4a91-ba4c-913400d12eb3 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:31.65653906 +0000 UTC m=+97.321129130 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls") pod "dns-default-hz28z" (UID: "a7fc9465-0576-4a91-ba4c-913400d12eb3") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:59.656848 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:59.656497 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:59.656848 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:59.656594 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f7f94fbd-b7gks: secret "image-registry-tls" not found
Apr 21 03:57:59.656848 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:59.656651 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls podName:a3b6160a-423d-45d3-b52f-9933a1bfdbc6 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:31.656634207 +0000 UTC m=+97.321224291 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls") pod "image-registry-64f7f94fbd-b7gks" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6") : secret "image-registry-tls" not found
Apr 21 03:57:59.757059 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:57:59.757035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9"
Apr 21 03:57:59.757173 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:59.757132 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:59.757173 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:57:59.757170 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert podName:8b7373a9-03be-47d1-9a03-d92ce2e99f2a nodeName:}" failed. No retries permitted until 2026-04-21 03:58:31.757159771 +0000 UTC m=+97.421749835 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert") pod "ingress-canary-6kgc9" (UID: "8b7373a9-03be-47d1-9a03-d92ce2e99f2a") : secret "canary-serving-cert" not found
Apr 21 03:58:00.663931 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:00.663902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt"
Apr 21 03:58:00.664295 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:58:00.664046 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 03:58:00.664295 ip-10-0-134-15 kubenswrapper[2575]: E0421 03:58:00.664107 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs podName:a42d1f99-c5af-45d0-9ce8-8affe0d01ea4 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:04.66409196 +0000 UTC m=+130.328682023 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs") pod "network-metrics-daemon-gpfrt" (UID: "a42d1f99-c5af-45d0-9ce8-8affe0d01ea4") : secret "metrics-daemon-secret" not found Apr 21 03:58:02.534950 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:02.534918 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-j8nd9_89bfd36f-311d-477d-a1ba-cc9a2854b55d/dns-node-resolver/0.log" Apr 21 03:58:03.939423 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:03.939389 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tqkpr_48670ed3-db9b-4299-a9b9-270bf4c32561/node-ca/0.log" Apr 21 03:58:08.223035 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:08.222997 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9cslm" Apr 21 03:58:23.611203 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.611075 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-s9nkw"] Apr 21 03:58:23.616217 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.616194 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.619617 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.619599 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 03:58:23.620954 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.620931 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 03:58:23.621046 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.620991 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qz42s\"" Apr 21 03:58:23.621046 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.621016 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 03:58:23.621211 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.621195 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 03:58:23.636146 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.636125 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s9nkw"] Apr 21 03:58:23.715421 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.715397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bda59d05-4057-4aab-ae91-a860d3e62ba1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.715527 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.715423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bda59d05-4057-4aab-ae91-a860d3e62ba1-data-volume\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.715527 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.715456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bda59d05-4057-4aab-ae91-a860d3e62ba1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.715643 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.715621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgnwj\" (UniqueName: \"kubernetes.io/projected/bda59d05-4057-4aab-ae91-a860d3e62ba1-kube-api-access-qgnwj\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.715713 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.715697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bda59d05-4057-4aab-ae91-a860d3e62ba1-crio-socket\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.816191 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.816166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgnwj\" (UniqueName: \"kubernetes.io/projected/bda59d05-4057-4aab-ae91-a860d3e62ba1-kube-api-access-qgnwj\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") 
" pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.816283 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.816214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bda59d05-4057-4aab-ae91-a860d3e62ba1-crio-socket\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.816345 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.816333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bda59d05-4057-4aab-ae91-a860d3e62ba1-crio-socket\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.816385 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.816366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bda59d05-4057-4aab-ae91-a860d3e62ba1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.816424 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.816393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bda59d05-4057-4aab-ae91-a860d3e62ba1-data-volume\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.816462 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.816426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/bda59d05-4057-4aab-ae91-a860d3e62ba1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.816733 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.816714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bda59d05-4057-4aab-ae91-a860d3e62ba1-data-volume\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.816937 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.816922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bda59d05-4057-4aab-ae91-a860d3e62ba1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.818553 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.818534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bda59d05-4057-4aab-ae91-a860d3e62ba1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.825529 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:23.825509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgnwj\" (UniqueName: \"kubernetes.io/projected/bda59d05-4057-4aab-ae91-a860d3e62ba1-kube-api-access-qgnwj\") pod \"insights-runtime-extractor-s9nkw\" (UID: \"bda59d05-4057-4aab-ae91-a860d3e62ba1\") " pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:23.924435 ip-10-0-134-15 
kubenswrapper[2575]: I0421 03:58:23.924413 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s9nkw" Apr 21 03:58:24.049265 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:24.049241 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s9nkw"] Apr 21 03:58:24.052387 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:58:24.052361 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda59d05_4057_4aab_ae91_a860d3e62ba1.slice/crio-7bcd0d4452e52544d83b5393ca3e7a86a1d4a68348a13011170775cdee53946f WatchSource:0}: Error finding container 7bcd0d4452e52544d83b5393ca3e7a86a1d4a68348a13011170775cdee53946f: Status 404 returned error can't find the container with id 7bcd0d4452e52544d83b5393ca3e7a86a1d4a68348a13011170775cdee53946f Apr 21 03:58:24.337914 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:24.337830 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s9nkw" event={"ID":"bda59d05-4057-4aab-ae91-a860d3e62ba1","Type":"ContainerStarted","Data":"460347ca3534117ecb74d75ed8f161df52fe36bb008eff9c19a34821dc4fc23f"} Apr 21 03:58:24.337914 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:24.337868 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s9nkw" event={"ID":"bda59d05-4057-4aab-ae91-a860d3e62ba1","Type":"ContainerStarted","Data":"7bcd0d4452e52544d83b5393ca3e7a86a1d4a68348a13011170775cdee53946f"} Apr 21 03:58:25.342453 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:25.342416 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s9nkw" event={"ID":"bda59d05-4057-4aab-ae91-a860d3e62ba1","Type":"ContainerStarted","Data":"58521b331363398bd3d484e32dc83d66c8a3c4b7f5462c260152aebe02fb1028"} Apr 21 03:58:26.346503 
ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:26.346468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s9nkw" event={"ID":"bda59d05-4057-4aab-ae91-a860d3e62ba1","Type":"ContainerStarted","Data":"5d2be3f6d13a98caf0eb83fa1dab7f6fbdf93c9bf88c3f9f9be2503c0a3f4e5d"} Apr 21 03:58:26.364276 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:26.364167 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-s9nkw" podStartSLOduration=1.6050549219999999 podStartE2EDuration="3.364154051s" podCreationTimestamp="2026-04-21 03:58:23 +0000 UTC" firstStartedPulling="2026-04-21 03:58:24.105436343 +0000 UTC m=+89.770026410" lastFinishedPulling="2026-04-21 03:58:25.864535476 +0000 UTC m=+91.529125539" observedRunningTime="2026-04-21 03:58:26.363917296 +0000 UTC m=+92.028507385" watchObservedRunningTime="2026-04-21 03:58:26.364154051 +0000 UTC m=+92.028744137" Apr 21 03:58:31.677240 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.677198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" Apr 21 03:58:31.677240 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.677245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z" Apr 21 03:58:31.679606 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.679578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a7fc9465-0576-4a91-ba4c-913400d12eb3-metrics-tls\") pod \"dns-default-hz28z\" (UID: \"a7fc9465-0576-4a91-ba4c-913400d12eb3\") " pod="openshift-dns/dns-default-hz28z" Apr 21 03:58:31.679606 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.679596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"image-registry-64f7f94fbd-b7gks\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" Apr 21 03:58:31.777664 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.777640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9" Apr 21 03:58:31.779711 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.779691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b7373a9-03be-47d1-9a03-d92ce2e99f2a-cert\") pod \"ingress-canary-6kgc9\" (UID: \"8b7373a9-03be-47d1-9a03-d92ce2e99f2a\") " pod="openshift-ingress-canary/ingress-canary-6kgc9" Apr 21 03:58:31.960375 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.960354 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ntv6f\"" Apr 21 03:58:31.968660 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.968640 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" Apr 21 03:58:31.973712 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.973695 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lrpqk\"" Apr 21 03:58:31.981077 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:31.981052 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hz28z" Apr 21 03:58:32.062266 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.062237 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4vtdx\"" Apr 21 03:58:32.068129 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.068106 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6kgc9" Apr 21 03:58:32.103055 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:58:32.103024 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b6160a_423d_45d3_b52f_9933a1bfdbc6.slice/crio-83985a9b081f99cc905ff4bd9d1daaf588f3ee020f83813eca9619372e08aade WatchSource:0}: Error finding container 83985a9b081f99cc905ff4bd9d1daaf588f3ee020f83813eca9619372e08aade: Status 404 returned error can't find the container with id 83985a9b081f99cc905ff4bd9d1daaf588f3ee020f83813eca9619372e08aade Apr 21 03:58:32.103357 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.103338 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64f7f94fbd-b7gks"] Apr 21 03:58:32.117007 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.116708 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hz28z"] Apr 21 03:58:32.119729 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:58:32.119701 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7fc9465_0576_4a91_ba4c_913400d12eb3.slice/crio-1afbb0f7caad382ec5750860cfa59ddbb161af2c47242401fed7937d71e6647b WatchSource:0}: Error finding container 1afbb0f7caad382ec5750860cfa59ddbb161af2c47242401fed7937d71e6647b: Status 404 returned error can't find the container with id 1afbb0f7caad382ec5750860cfa59ddbb161af2c47242401fed7937d71e6647b Apr 21 03:58:32.186282 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.186257 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6kgc9"] Apr 21 03:58:32.188870 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:58:32.188847 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b7373a9_03be_47d1_9a03_d92ce2e99f2a.slice/crio-b457eac2980048f65da907ca9a4fef3aa8ed363a258b148ed5b13f55f11b8235 WatchSource:0}: Error finding container b457eac2980048f65da907ca9a4fef3aa8ed363a258b148ed5b13f55f11b8235: Status 404 returned error can't find the container with id b457eac2980048f65da907ca9a4fef3aa8ed363a258b148ed5b13f55f11b8235 Apr 21 03:58:32.364356 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.364265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hz28z" event={"ID":"a7fc9465-0576-4a91-ba4c-913400d12eb3","Type":"ContainerStarted","Data":"1afbb0f7caad382ec5750860cfa59ddbb161af2c47242401fed7937d71e6647b"} Apr 21 03:58:32.365569 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.365541 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" event={"ID":"a3b6160a-423d-45d3-b52f-9933a1bfdbc6","Type":"ContainerStarted","Data":"c2a2b752f5acbd38150fa1c1a7e6109476ffd524bbfd7bb3c8b283e490b72b14"} Apr 21 03:58:32.365683 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.365575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" event={"ID":"a3b6160a-423d-45d3-b52f-9933a1bfdbc6","Type":"ContainerStarted","Data":"83985a9b081f99cc905ff4bd9d1daaf588f3ee020f83813eca9619372e08aade"} Apr 21 03:58:32.365683 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.365672 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" Apr 21 03:58:32.366565 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.366539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6kgc9" event={"ID":"8b7373a9-03be-47d1-9a03-d92ce2e99f2a","Type":"ContainerStarted","Data":"b457eac2980048f65da907ca9a4fef3aa8ed363a258b148ed5b13f55f11b8235"} Apr 21 03:58:32.386815 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:32.386772 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" podStartSLOduration=97.386742958 podStartE2EDuration="1m37.386742958s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:58:32.385611372 +0000 UTC m=+98.050201455" watchObservedRunningTime="2026-04-21 03:58:32.386742958 +0000 UTC m=+98.051333043" Apr 21 03:58:34.373172 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:34.373139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hz28z" event={"ID":"a7fc9465-0576-4a91-ba4c-913400d12eb3","Type":"ContainerStarted","Data":"4e8225b462129ebde046d6a91ff909221863237191f622649033b49bdba2a111"} Apr 21 03:58:34.373172 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:34.373175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hz28z" 
event={"ID":"a7fc9465-0576-4a91-ba4c-913400d12eb3","Type":"ContainerStarted","Data":"b9f910a91367621612dee1b6c843a4ed76a600ffaab81212b97ff65e9e623b51"} Apr 21 03:58:34.373658 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:34.373265 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hz28z" Apr 21 03:58:34.374498 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:34.374478 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6kgc9" event={"ID":"8b7373a9-03be-47d1-9a03-d92ce2e99f2a","Type":"ContainerStarted","Data":"edf182d46b2d3b095dbd2e904378d04df152f0f5c5ca597669a5e1345141978d"} Apr 21 03:58:34.390448 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:34.390405 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hz28z" podStartSLOduration=65.723444652 podStartE2EDuration="1m7.390393501s" podCreationTimestamp="2026-04-21 03:57:27 +0000 UTC" firstStartedPulling="2026-04-21 03:58:32.122189726 +0000 UTC m=+97.786779794" lastFinishedPulling="2026-04-21 03:58:33.789138566 +0000 UTC m=+99.453728643" observedRunningTime="2026-04-21 03:58:34.388773566 +0000 UTC m=+100.053363644" watchObservedRunningTime="2026-04-21 03:58:34.390393501 +0000 UTC m=+100.054983586" Apr 21 03:58:34.403266 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:34.403231 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6kgc9" podStartSLOduration=65.801565859 podStartE2EDuration="1m7.403220411s" podCreationTimestamp="2026-04-21 03:57:27 +0000 UTC" firstStartedPulling="2026-04-21 03:58:32.190617348 +0000 UTC m=+97.855207413" lastFinishedPulling="2026-04-21 03:58:33.792271901 +0000 UTC m=+99.456861965" observedRunningTime="2026-04-21 03:58:34.402346493 +0000 UTC m=+100.066936578" watchObservedRunningTime="2026-04-21 03:58:34.403220411 +0000 UTC m=+100.067810536" Apr 21 03:58:40.151336 
ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.151303 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4hlnc"] Apr 21 03:58:40.156300 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.156280 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.159028 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.158991 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pnzbl\"" Apr 21 03:58:40.159028 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.158998 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 03:58:40.159216 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.159158 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 03:58:40.160142 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.160120 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 03:58:40.160268 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.160201 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 03:58:40.160365 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.160344 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 03:58:40.160428 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.160410 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 03:58:40.242016 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.241990 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-root\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.242016 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.242018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-sys\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.242159 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.242034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-tls\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.242159 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.242055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39d1ef11-133c-4565-a940-89e38405f4e3-metrics-client-ca\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.242159 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.242129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-accelerators-collector-config\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " 
pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.242256 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.242184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h94pk\" (UniqueName: \"kubernetes.io/projected/39d1ef11-133c-4565-a940-89e38405f4e3-kube-api-access-h94pk\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.242256 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.242206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.242256 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.242225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-textfile\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.242256 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.242243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-wtmp\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343178 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343178 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-textfile\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343310 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-wtmp\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343310 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-root\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343310 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-sys\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343310 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343271 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-tls\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343310 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39d1ef11-133c-4565-a940-89e38405f4e3-metrics-client-ca\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343547 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-wtmp\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343547 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-sys\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343547 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/39d1ef11-133c-4565-a940-89e38405f4e3-root\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343547 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-accelerators-collector-config\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343547 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h94pk\" (UniqueName: \"kubernetes.io/projected/39d1ef11-133c-4565-a940-89e38405f4e3-kube-api-access-h94pk\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343547 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-textfile\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343847 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343831 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39d1ef11-133c-4565-a940-89e38405f4e3-metrics-client-ca\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.343915 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.343893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-accelerators-collector-config\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.345874 
ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.345847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-tls\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.345874 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.345857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39d1ef11-133c-4565-a940-89e38405f4e3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.351051 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.351031 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h94pk\" (UniqueName: \"kubernetes.io/projected/39d1ef11-133c-4565-a940-89e38405f4e3-kube-api-access-h94pk\") pod \"node-exporter-4hlnc\" (UID: \"39d1ef11-133c-4565-a940-89e38405f4e3\") " pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.466287 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:40.466221 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4hlnc" Apr 21 03:58:40.475831 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:58:40.475802 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d1ef11_133c_4565_a940_89e38405f4e3.slice/crio-1532453b9a853382be9817951624502fed82fc07215d5a344c15103da09bd323 WatchSource:0}: Error finding container 1532453b9a853382be9817951624502fed82fc07215d5a344c15103da09bd323: Status 404 returned error can't find the container with id 1532453b9a853382be9817951624502fed82fc07215d5a344c15103da09bd323 Apr 21 03:58:41.392920 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:41.392894 2575 generic.go:358] "Generic (PLEG): container finished" podID="39d1ef11-133c-4565-a940-89e38405f4e3" containerID="94827a94d8c987100459a8b9350b66805beb84fa74574d67c5c713d50042b161" exitCode=0 Apr 21 03:58:41.393207 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:41.392965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4hlnc" event={"ID":"39d1ef11-133c-4565-a940-89e38405f4e3","Type":"ContainerDied","Data":"94827a94d8c987100459a8b9350b66805beb84fa74574d67c5c713d50042b161"} Apr 21 03:58:41.393207 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:41.392998 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4hlnc" event={"ID":"39d1ef11-133c-4565-a940-89e38405f4e3","Type":"ContainerStarted","Data":"1532453b9a853382be9817951624502fed82fc07215d5a344c15103da09bd323"} Apr 21 03:58:42.398204 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:42.398170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4hlnc" event={"ID":"39d1ef11-133c-4565-a940-89e38405f4e3","Type":"ContainerStarted","Data":"b0aac412bc8dc923c8ca16f95febe09919ced14b7b178a37c3e1309f13de064a"} Apr 21 03:58:42.398204 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:42.398212 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4hlnc" event={"ID":"39d1ef11-133c-4565-a940-89e38405f4e3","Type":"ContainerStarted","Data":"4e772da8ad4271d822a5532e43bbd8ce8cb360e888efeb57ee9867ed541f9613"} Apr 21 03:58:44.379319 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:44.379272 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hz28z" Apr 21 03:58:44.397259 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:44.397216 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4hlnc" podStartSLOduration=3.755706314 podStartE2EDuration="4.39720052s" podCreationTimestamp="2026-04-21 03:58:40 +0000 UTC" firstStartedPulling="2026-04-21 03:58:40.477564295 +0000 UTC m=+106.142154359" lastFinishedPulling="2026-04-21 03:58:41.119058496 +0000 UTC m=+106.783648565" observedRunningTime="2026-04-21 03:58:42.419599269 +0000 UTC m=+108.084189367" watchObservedRunningTime="2026-04-21 03:58:44.39720052 +0000 UTC m=+110.061790606" Apr 21 03:58:45.258568 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:45.258532 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64f7f94fbd-b7gks"] Apr 21 03:58:55.263808 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:55.263780 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" Apr 21 03:58:58.860184 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:58:58.860126 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" podUID="888bac5f-e281-4267-ae08-fce73b1f965e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 03:59:04.710811 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:04.710775 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:59:04.713015 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:04.712995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d1f99-c5af-45d0-9ce8-8affe0d01ea4-metrics-certs\") pod \"network-metrics-daemon-gpfrt\" (UID: \"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4\") " pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:59:04.941828 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:04.941802 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rcf6j\"" Apr 21 03:59:04.947235 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:04.947213 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpfrt" Apr 21 03:59:05.058529 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:05.058466 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gpfrt"] Apr 21 03:59:05.060776 ip-10-0-134-15 kubenswrapper[2575]: W0421 03:59:05.060734 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42d1f99_c5af_45d0_9ce8_8affe0d01ea4.slice/crio-26994ecdc361cea7d5a1f07433b9a112cfd9b2a3c37c6fb64313003b4315fcc5 WatchSource:0}: Error finding container 26994ecdc361cea7d5a1f07433b9a112cfd9b2a3c37c6fb64313003b4315fcc5: Status 404 returned error can't find the container with id 26994ecdc361cea7d5a1f07433b9a112cfd9b2a3c37c6fb64313003b4315fcc5 Apr 21 03:59:05.454367 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:05.454331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gpfrt" event={"ID":"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4","Type":"ContainerStarted","Data":"26994ecdc361cea7d5a1f07433b9a112cfd9b2a3c37c6fb64313003b4315fcc5"} Apr 21 03:59:06.459312 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:06.459279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gpfrt" event={"ID":"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4","Type":"ContainerStarted","Data":"53e258c461fba8692415a3cf2615d37d4c4b4b9ddec37c14e463d815203187b3"} Apr 21 03:59:06.459312 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:06.459314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gpfrt" event={"ID":"a42d1f99-c5af-45d0-9ce8-8affe0d01ea4","Type":"ContainerStarted","Data":"3b6905e087246b96d10a11448dd8ae1a8616c5c37432cbf265f079a655196742"} Apr 21 03:59:06.473735 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:06.473688 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-gpfrt" podStartSLOduration=130.524855038 podStartE2EDuration="2m11.473673761s" podCreationTimestamp="2026-04-21 03:56:55 +0000 UTC" firstStartedPulling="2026-04-21 03:59:05.065050599 +0000 UTC m=+130.729640663" lastFinishedPulling="2026-04-21 03:59:06.013869315 +0000 UTC m=+131.678459386" observedRunningTime="2026-04-21 03:59:06.472946296 +0000 UTC m=+132.137536383" watchObservedRunningTime="2026-04-21 03:59:06.473673761 +0000 UTC m=+132.138263847" Apr 21 03:59:08.859448 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:08.859412 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" podUID="888bac5f-e281-4267-ae08-fce73b1f965e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 03:59:10.276747 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.276703 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" podUID="a3b6160a-423d-45d3-b52f-9933a1bfdbc6" containerName="registry" containerID="cri-o://c2a2b752f5acbd38150fa1c1a7e6109476ffd524bbfd7bb3c8b283e490b72b14" gracePeriod=30 Apr 21 03:59:10.471807 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.471774 2575 generic.go:358] "Generic (PLEG): container finished" podID="a3b6160a-423d-45d3-b52f-9933a1bfdbc6" containerID="c2a2b752f5acbd38150fa1c1a7e6109476ffd524bbfd7bb3c8b283e490b72b14" exitCode=0 Apr 21 03:59:10.471956 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.471822 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" event={"ID":"a3b6160a-423d-45d3-b52f-9933a1bfdbc6","Type":"ContainerDied","Data":"c2a2b752f5acbd38150fa1c1a7e6109476ffd524bbfd7bb3c8b283e490b72b14"} Apr 21 03:59:10.503243 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.503225 2575 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" Apr 21 03:59:10.649958 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.649936 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-ca-trust-extracted\") pod \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " Apr 21 03:59:10.650098 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.649975 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srt7v\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-kube-api-access-srt7v\") pod \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " Apr 21 03:59:10.650098 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.650011 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-image-registry-private-configuration\") pod \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " Apr 21 03:59:10.650919 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.650340 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-installation-pull-secrets\") pod \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " Apr 21 03:59:10.650919 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.650601 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-trusted-ca\") pod \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\" (UID: 
\"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " Apr 21 03:59:10.650919 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.650646 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-bound-sa-token\") pod \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " Apr 21 03:59:10.650919 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.650682 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") pod \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " Apr 21 03:59:10.650919 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.650718 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-certificates\") pod \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\" (UID: \"a3b6160a-423d-45d3-b52f-9933a1bfdbc6\") " Apr 21 03:59:10.651716 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.651609 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a3b6160a-423d-45d3-b52f-9933a1bfdbc6" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 03:59:10.653674 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.652962 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a3b6160a-423d-45d3-b52f-9933a1bfdbc6" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 03:59:10.657133 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.657100 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a3b6160a-423d-45d3-b52f-9933a1bfdbc6" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 03:59:10.658572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.658326 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a3b6160a-423d-45d3-b52f-9933a1bfdbc6" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 03:59:10.658572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.658134 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-kube-api-access-srt7v" (OuterVolumeSpecName: "kube-api-access-srt7v") pod "a3b6160a-423d-45d3-b52f-9933a1bfdbc6" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6"). InnerVolumeSpecName "kube-api-access-srt7v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 03:59:10.658572 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.658409 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a3b6160a-423d-45d3-b52f-9933a1bfdbc6" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 03:59:10.659165 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.659141 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a3b6160a-423d-45d3-b52f-9933a1bfdbc6" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 03:59:10.661484 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.661457 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a3b6160a-423d-45d3-b52f-9933a1bfdbc6" (UID: "a3b6160a-423d-45d3-b52f-9933a1bfdbc6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 03:59:10.753143 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.753108 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-installation-pull-secrets\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 03:59:10.753143 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.753139 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-trusted-ca\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 03:59:10.753143 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.753149 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-bound-sa-token\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 03:59:10.753347 ip-10-0-134-15 kubenswrapper[2575]: I0421 
03:59:10.753158 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 03:59:10.753347 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.753166 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-registry-certificates\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 03:59:10.753347 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.753175 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-ca-trust-extracted\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 03:59:10.753347 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.753183 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-srt7v\" (UniqueName: \"kubernetes.io/projected/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-kube-api-access-srt7v\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 03:59:10.753347 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:10.753194 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3b6160a-423d-45d3-b52f-9933a1bfdbc6-image-registry-private-configuration\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 03:59:11.475144 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:11.475109 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" event={"ID":"a3b6160a-423d-45d3-b52f-9933a1bfdbc6","Type":"ContainerDied","Data":"83985a9b081f99cc905ff4bd9d1daaf588f3ee020f83813eca9619372e08aade"} Apr 21 03:59:11.475144 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:11.475136 2575 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64f7f94fbd-b7gks" Apr 21 03:59:11.475595 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:11.475158 2575 scope.go:117] "RemoveContainer" containerID="c2a2b752f5acbd38150fa1c1a7e6109476ffd524bbfd7bb3c8b283e490b72b14" Apr 21 03:59:11.500136 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:11.500109 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64f7f94fbd-b7gks"] Apr 21 03:59:11.507346 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:11.507323 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-64f7f94fbd-b7gks"] Apr 21 03:59:13.027442 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:13.027411 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b6160a-423d-45d3-b52f-9933a1bfdbc6" path="/var/lib/kubelet/pods/a3b6160a-423d-45d3-b52f-9933a1bfdbc6/volumes" Apr 21 03:59:18.860081 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:18.860037 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" podUID="888bac5f-e281-4267-ae08-fce73b1f965e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 03:59:18.860444 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:18.860109 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" Apr 21 03:59:18.860697 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:18.860654 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a99f795a44659d090b84b5035a811187ee56175b43d7bc60824826e346b14dea"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" 
containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 03:59:18.860739 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:18.860721 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" podUID="888bac5f-e281-4267-ae08-fce73b1f965e" containerName="service-proxy" containerID="cri-o://a99f795a44659d090b84b5035a811187ee56175b43d7bc60824826e346b14dea" gracePeriod=30 Apr 21 03:59:19.499030 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:19.498997 2575 generic.go:358] "Generic (PLEG): container finished" podID="888bac5f-e281-4267-ae08-fce73b1f965e" containerID="a99f795a44659d090b84b5035a811187ee56175b43d7bc60824826e346b14dea" exitCode=2 Apr 21 03:59:19.499199 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:19.499051 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" event={"ID":"888bac5f-e281-4267-ae08-fce73b1f965e","Type":"ContainerDied","Data":"a99f795a44659d090b84b5035a811187ee56175b43d7bc60824826e346b14dea"} Apr 21 03:59:19.499199 ip-10-0-134-15 kubenswrapper[2575]: I0421 03:59:19.499083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-699587cf6f-ffp95" event={"ID":"888bac5f-e281-4267-ae08-fce73b1f965e","Type":"ContainerStarted","Data":"d1d091b5f6ef7291c609bc9b312494ce74f02724b39a4b45679be3692bedcdd1"} Apr 21 04:01:43.773175 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.773144 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"] Apr 21 04:01:43.773558 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.773395 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3b6160a-423d-45d3-b52f-9933a1bfdbc6" containerName="registry" Apr 21 04:01:43.773558 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:01:43.773406 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b6160a-423d-45d3-b52f-9933a1bfdbc6" containerName="registry"
Apr 21 04:01:43.773558 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.773457 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3b6160a-423d-45d3-b52f-9933a1bfdbc6" containerName="registry"
Apr 21 04:01:43.776030 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.776015 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:43.778429 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.778406 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 21 04:01:43.778559 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.778495 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 21 04:01:43.778559 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.778534 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 21 04:01:43.778769 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.778743 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 21 04:01:43.779785 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.779747 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-mjx4l\""
Apr 21 04:01:43.779845 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.779775 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 21 04:01:43.785957 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.785930 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"]
Apr 21 04:01:43.936395 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.936357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45hf\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-kube-api-access-x45hf\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:43.936554 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.936414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2f6c6f18-adb9-4035-8ac9-50bae2019f45-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:43.936554 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:43.936441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:44.037085 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:44.037021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x45hf\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-kube-api-access-x45hf\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:44.037085 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:44.037059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2f6c6f18-adb9-4035-8ac9-50bae2019f45-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:44.037085 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:44.037078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:44.037325 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:44.037175 2575 secret.go:281] references non-existent secret key: tls.crt
Apr 21 04:01:44.037325 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:44.037187 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 04:01:44.037325 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:44.037203 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl: references non-existent secret key: tls.crt
Apr 21 04:01:44.037325 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:44.037259 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates podName:2f6c6f18-adb9-4035-8ac9-50bae2019f45 nodeName:}" failed. No retries permitted until 2026-04-21 04:01:44.537243472 +0000 UTC m=+290.201833567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates") pod "keda-metrics-apiserver-7c9f485588-thjbl" (UID: "2f6c6f18-adb9-4035-8ac9-50bae2019f45") : references non-existent secret key: tls.crt
Apr 21 04:01:44.037501 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:44.037357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2f6c6f18-adb9-4035-8ac9-50bae2019f45-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:44.049430 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:44.049409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x45hf\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-kube-api-access-x45hf\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:44.541286 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:44.541260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:44.541453 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:44.541387 2575 secret.go:281] references non-existent secret key: tls.crt
Apr 21 04:01:44.541453 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:44.541401 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 04:01:44.541453 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:44.541417 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl: references non-existent secret key: tls.crt
Apr 21 04:01:44.541570 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:44.541465 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates podName:2f6c6f18-adb9-4035-8ac9-50bae2019f45 nodeName:}" failed. No retries permitted until 2026-04-21 04:01:45.541451771 +0000 UTC m=+291.206041834 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates") pod "keda-metrics-apiserver-7c9f485588-thjbl" (UID: "2f6c6f18-adb9-4035-8ac9-50bae2019f45") : references non-existent secret key: tls.crt
Apr 21 04:01:45.548819 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:45.548789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:45.549169 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:45.548886 2575 secret.go:281] references non-existent secret key: tls.crt
Apr 21 04:01:45.549169 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:45.548899 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 04:01:45.549169 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:45.548915 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl: references non-existent secret key: tls.crt
Apr 21 04:01:45.549169 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:01:45.548974 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates podName:2f6c6f18-adb9-4035-8ac9-50bae2019f45 nodeName:}" failed. No retries permitted until 2026-04-21 04:01:47.548960943 +0000 UTC m=+293.213551007 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates") pod "keda-metrics-apiserver-7c9f485588-thjbl" (UID: "2f6c6f18-adb9-4035-8ac9-50bae2019f45") : references non-existent secret key: tls.crt
Apr 21 04:01:47.560169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:47.560140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:47.562553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:47.562521 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2f6c6f18-adb9-4035-8ac9-50bae2019f45-certificates\") pod \"keda-metrics-apiserver-7c9f485588-thjbl\" (UID: \"2f6c6f18-adb9-4035-8ac9-50bae2019f45\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:47.686408 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:47.686379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:47.797808 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:47.797717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"]
Apr 21 04:01:47.800283 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:01:47.800255 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f6c6f18_adb9_4035_8ac9_50bae2019f45.slice/crio-d2e10e5923277d1f81c1aff554301e7aca1e7f68c29e1ad69899565628163df7 WatchSource:0}: Error finding container d2e10e5923277d1f81c1aff554301e7aca1e7f68c29e1ad69899565628163df7: Status 404 returned error can't find the container with id d2e10e5923277d1f81c1aff554301e7aca1e7f68c29e1ad69899565628163df7
Apr 21 04:01:47.873166 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:47.873141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl" event={"ID":"2f6c6f18-adb9-4035-8ac9-50bae2019f45","Type":"ContainerStarted","Data":"d2e10e5923277d1f81c1aff554301e7aca1e7f68c29e1ad69899565628163df7"}
Apr 21 04:01:50.883498 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:50.883462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl" event={"ID":"2f6c6f18-adb9-4035-8ac9-50bae2019f45","Type":"ContainerStarted","Data":"188c03ffdfb747215d9fca309aba1643162ac433f4044e721aa3b0e0daa70205"}
Apr 21 04:01:50.883988 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:50.883589 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:01:50.899343 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:50.899304 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl" podStartSLOduration=5.201506704 podStartE2EDuration="7.899292686s" podCreationTimestamp="2026-04-21 04:01:43 +0000 UTC" firstStartedPulling="2026-04-21 04:01:47.801539 +0000 UTC m=+293.466129068" lastFinishedPulling="2026-04-21 04:01:50.499324974 +0000 UTC m=+296.163915050" observedRunningTime="2026-04-21 04:01:50.898403184 +0000 UTC m=+296.562993271" watchObservedRunningTime="2026-04-21 04:01:50.899292686 +0000 UTC m=+296.563882768"
Apr 21 04:01:54.870976 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:01:54.870948 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 04:02:01.890528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:01.890499 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-thjbl"
Apr 21 04:02:49.468349 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.468275 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-jhrnv"]
Apr 21 04:02:49.471262 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.471246 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:49.474187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.474164 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 21 04:02:49.474293 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.474187 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 21 04:02:49.474293 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.474171 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 21 04:02:49.475077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.475060 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qsts\""
Apr 21 04:02:49.478662 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.478643 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-jhrnv"]
Apr 21 04:02:49.552681 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.552651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d7a895f7-adb0-4db2-9739-b92a466458f7-data\") pod \"seaweedfs-86cc847c5c-jhrnv\" (UID: \"d7a895f7-adb0-4db2-9739-b92a466458f7\") " pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:49.552877 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.552722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsscg\" (UniqueName: \"kubernetes.io/projected/d7a895f7-adb0-4db2-9739-b92a466458f7-kube-api-access-tsscg\") pod \"seaweedfs-86cc847c5c-jhrnv\" (UID: \"d7a895f7-adb0-4db2-9739-b92a466458f7\") " pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:49.653495 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.653462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsscg\" (UniqueName: \"kubernetes.io/projected/d7a895f7-adb0-4db2-9739-b92a466458f7-kube-api-access-tsscg\") pod \"seaweedfs-86cc847c5c-jhrnv\" (UID: \"d7a895f7-adb0-4db2-9739-b92a466458f7\") " pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:49.653651 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.653509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d7a895f7-adb0-4db2-9739-b92a466458f7-data\") pod \"seaweedfs-86cc847c5c-jhrnv\" (UID: \"d7a895f7-adb0-4db2-9739-b92a466458f7\") " pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:49.653913 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.653887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d7a895f7-adb0-4db2-9739-b92a466458f7-data\") pod \"seaweedfs-86cc847c5c-jhrnv\" (UID: \"d7a895f7-adb0-4db2-9739-b92a466458f7\") " pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:49.661796 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.661774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsscg\" (UniqueName: \"kubernetes.io/projected/d7a895f7-adb0-4db2-9739-b92a466458f7-kube-api-access-tsscg\") pod \"seaweedfs-86cc847c5c-jhrnv\" (UID: \"d7a895f7-adb0-4db2-9739-b92a466458f7\") " pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:49.781181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.781086 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:49.893549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.893517 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-jhrnv"]
Apr 21 04:02:49.897129 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:02:49.897101 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a895f7_adb0_4db2_9739_b92a466458f7.slice/crio-750219f725d5992205b7079b874ffbc9da65727538424661998e98ba235f91cc WatchSource:0}: Error finding container 750219f725d5992205b7079b874ffbc9da65727538424661998e98ba235f91cc: Status 404 returned error can't find the container with id 750219f725d5992205b7079b874ffbc9da65727538424661998e98ba235f91cc
Apr 21 04:02:49.898268 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:49.898253 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:02:50.027136 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:50.027101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-jhrnv" event={"ID":"d7a895f7-adb0-4db2-9739-b92a466458f7","Type":"ContainerStarted","Data":"750219f725d5992205b7079b874ffbc9da65727538424661998e98ba235f91cc"}
Apr 21 04:02:54.038045 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:54.038012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-jhrnv" event={"ID":"d7a895f7-adb0-4db2-9739-b92a466458f7","Type":"ContainerStarted","Data":"ee00193bb4218e3102851f73b6907c236fc24860273632216996a67b9f237b7b"}
Apr 21 04:02:54.038400 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:54.038155 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:02:54.053620 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:02:54.053576 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-jhrnv" podStartSLOduration=1.868541587 podStartE2EDuration="5.053563896s" podCreationTimestamp="2026-04-21 04:02:49 +0000 UTC" firstStartedPulling="2026-04-21 04:02:49.898371669 +0000 UTC m=+355.562961737" lastFinishedPulling="2026-04-21 04:02:53.083393981 +0000 UTC m=+358.747984046" observedRunningTime="2026-04-21 04:02:54.05211118 +0000 UTC m=+359.716701266" watchObservedRunningTime="2026-04-21 04:02:54.053563896 +0000 UTC m=+359.718153983"
Apr 21 04:03:00.041926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:03:00.041884 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-jhrnv"
Apr 21 04:04:00.134592 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.134559 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-9zk6d"]
Apr 21 04:04:00.137563 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.137541 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:00.141391 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.141367 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-jnswd\""
Apr 21 04:04:00.141500 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.141402 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 21 04:04:00.146108 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.146086 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9zk6d"]
Apr 21 04:04:00.160366 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.160346 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-mbzsg"]
Apr 21 04:04:00.163579 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.163559 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:00.166130 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.166113 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 21 04:04:00.166230 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.166117 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-bw4zb\""
Apr 21 04:04:00.173706 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.173683 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-mbzsg"]
Apr 21 04:04:00.215145 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.215125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t78gr\" (UniqueName: \"kubernetes.io/projected/1e07702c-3b11-4397-a13a-e55dc0e6cc99-kube-api-access-t78gr\") pod \"model-serving-api-86f7b4b499-9zk6d\" (UID: \"1e07702c-3b11-4397-a13a-e55dc0e6cc99\") " pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:00.215236 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.215174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e07702c-3b11-4397-a13a-e55dc0e6cc99-tls-certs\") pod \"model-serving-api-86f7b4b499-9zk6d\" (UID: \"1e07702c-3b11-4397-a13a-e55dc0e6cc99\") " pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:00.315832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.315803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e07702c-3b11-4397-a13a-e55dc0e6cc99-tls-certs\") pod \"model-serving-api-86f7b4b499-9zk6d\" (UID: \"1e07702c-3b11-4397-a13a-e55dc0e6cc99\") " pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:00.315986 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.315856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwmv\" (UniqueName: \"kubernetes.io/projected/3dba583d-6ff8-4afe-9529-9ad2120fa3e2-kube-api-access-ngwmv\") pod \"odh-model-controller-696fc77849-mbzsg\" (UID: \"3dba583d-6ff8-4afe-9529-9ad2120fa3e2\") " pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:00.315986 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.315899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dba583d-6ff8-4afe-9529-9ad2120fa3e2-cert\") pod \"odh-model-controller-696fc77849-mbzsg\" (UID: \"3dba583d-6ff8-4afe-9529-9ad2120fa3e2\") " pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:00.315986 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.315927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t78gr\" (UniqueName: \"kubernetes.io/projected/1e07702c-3b11-4397-a13a-e55dc0e6cc99-kube-api-access-t78gr\") pod \"model-serving-api-86f7b4b499-9zk6d\" (UID: \"1e07702c-3b11-4397-a13a-e55dc0e6cc99\") " pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:00.318249 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.318230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e07702c-3b11-4397-a13a-e55dc0e6cc99-tls-certs\") pod \"model-serving-api-86f7b4b499-9zk6d\" (UID: \"1e07702c-3b11-4397-a13a-e55dc0e6cc99\") " pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:00.324332 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.324301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t78gr\" (UniqueName: \"kubernetes.io/projected/1e07702c-3b11-4397-a13a-e55dc0e6cc99-kube-api-access-t78gr\") pod \"model-serving-api-86f7b4b499-9zk6d\" (UID: \"1e07702c-3b11-4397-a13a-e55dc0e6cc99\") " pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:00.417077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.416992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwmv\" (UniqueName: \"kubernetes.io/projected/3dba583d-6ff8-4afe-9529-9ad2120fa3e2-kube-api-access-ngwmv\") pod \"odh-model-controller-696fc77849-mbzsg\" (UID: \"3dba583d-6ff8-4afe-9529-9ad2120fa3e2\") " pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:00.417077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.417038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dba583d-6ff8-4afe-9529-9ad2120fa3e2-cert\") pod \"odh-model-controller-696fc77849-mbzsg\" (UID: \"3dba583d-6ff8-4afe-9529-9ad2120fa3e2\") " pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:00.419347 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.419328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dba583d-6ff8-4afe-9529-9ad2120fa3e2-cert\") pod \"odh-model-controller-696fc77849-mbzsg\" (UID: \"3dba583d-6ff8-4afe-9529-9ad2120fa3e2\") " pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:00.427885 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.427861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngwmv\" (UniqueName: \"kubernetes.io/projected/3dba583d-6ff8-4afe-9529-9ad2120fa3e2-kube-api-access-ngwmv\") pod \"odh-model-controller-696fc77849-mbzsg\" (UID: \"3dba583d-6ff8-4afe-9529-9ad2120fa3e2\") " pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:00.447820 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.447803 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:00.476508 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.476484 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:00.567472 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.567444 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9zk6d"]
Apr 21 04:04:00.571028 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:04:00.570997 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e07702c_3b11_4397_a13a_e55dc0e6cc99.slice/crio-271cd749d535bd844f99d9aa6441fff40b9371e9975da8d645df4f4162b8c3dc WatchSource:0}: Error finding container 271cd749d535bd844f99d9aa6441fff40b9371e9975da8d645df4f4162b8c3dc: Status 404 returned error can't find the container with id 271cd749d535bd844f99d9aa6441fff40b9371e9975da8d645df4f4162b8c3dc
Apr 21 04:04:00.604020 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:00.603991 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-mbzsg"]
Apr 21 04:04:00.606461 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:04:00.606432 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dba583d_6ff8_4afe_9529_9ad2120fa3e2.slice/crio-4677d2c9d1ee31c48df0ee442288acdb13a452d3d9c42585ad4ec68d24b79493 WatchSource:0}: Error finding container 4677d2c9d1ee31c48df0ee442288acdb13a452d3d9c42585ad4ec68d24b79493: Status 404 returned error can't find the container with id 4677d2c9d1ee31c48df0ee442288acdb13a452d3d9c42585ad4ec68d24b79493
Apr 21 04:04:01.197272 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:01.197229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9zk6d" event={"ID":"1e07702c-3b11-4397-a13a-e55dc0e6cc99","Type":"ContainerStarted","Data":"271cd749d535bd844f99d9aa6441fff40b9371e9975da8d645df4f4162b8c3dc"}
Apr 21 04:04:01.198957 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:01.198923 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-mbzsg" event={"ID":"3dba583d-6ff8-4afe-9529-9ad2120fa3e2","Type":"ContainerStarted","Data":"4677d2c9d1ee31c48df0ee442288acdb13a452d3d9c42585ad4ec68d24b79493"}
Apr 21 04:04:05.211467 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:05.211432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-mbzsg" event={"ID":"3dba583d-6ff8-4afe-9529-9ad2120fa3e2","Type":"ContainerStarted","Data":"651c8955ac982df8a32020a1689bcefd66bb704e9e88c166c36e0a63fba36f4e"}
Apr 21 04:04:05.211893 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:05.211502 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:05.212785 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:05.212747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9zk6d" event={"ID":"1e07702c-3b11-4397-a13a-e55dc0e6cc99","Type":"ContainerStarted","Data":"ecbbbef48abc1474324e76b399bf953253a47d7e48bebb1578b1949771f8dd3e"}
Apr 21 04:04:05.212905 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:05.212894 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:05.228988 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:05.228950 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-mbzsg" podStartSLOduration=1.538256274 podStartE2EDuration="5.228937249s" podCreationTimestamp="2026-04-21 04:04:00 +0000 UTC" firstStartedPulling="2026-04-21 04:04:00.607704221 +0000 UTC m=+426.272294286" lastFinishedPulling="2026-04-21 04:04:04.298385183 +0000 UTC m=+429.962975261" observedRunningTime="2026-04-21 04:04:05.228052276 +0000 UTC m=+430.892642605" watchObservedRunningTime="2026-04-21 04:04:05.228937249 +0000 UTC m=+430.893527376"
Apr 21 04:04:05.245009 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:05.244963 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-9zk6d" podStartSLOduration=1.5184433579999999 podStartE2EDuration="5.244948682s" podCreationTimestamp="2026-04-21 04:04:00 +0000 UTC" firstStartedPulling="2026-04-21 04:04:00.573198242 +0000 UTC m=+426.237788311" lastFinishedPulling="2026-04-21 04:04:04.299703569 +0000 UTC m=+429.964293635" observedRunningTime="2026-04-21 04:04:05.243539719 +0000 UTC m=+430.908129804" watchObservedRunningTime="2026-04-21 04:04:05.244948682 +0000 UTC m=+430.909538773"
Apr 21 04:04:16.217547 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:16.217465 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-mbzsg"
Apr 21 04:04:16.219657 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:16.219639 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-9zk6d"
Apr 21 04:04:27.821741 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:27.821711 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"]
Apr 21 04:04:27.824802 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:27.824786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:27.827262 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:27.827239 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 21 04:04:27.830669 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:27.830646 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"]
Apr 21 04:04:27.899461 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:27.899437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5de6234-4404-4321-881f-8d0f2e6d2747-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-lbp5w\" (UID: \"b5de6234-4404-4321-881f-8d0f2e6d2747\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:27.899557 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:27.899479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnf7q\" (UniqueName: \"kubernetes.io/projected/b5de6234-4404-4321-881f-8d0f2e6d2747-kube-api-access-xnf7q\") pod \"seaweedfs-tls-custom-ddd4dbfd-lbp5w\" (UID: \"b5de6234-4404-4321-881f-8d0f2e6d2747\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:28.000681 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:28.000657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnf7q\" (UniqueName: \"kubernetes.io/projected/b5de6234-4404-4321-881f-8d0f2e6d2747-kube-api-access-xnf7q\") pod \"seaweedfs-tls-custom-ddd4dbfd-lbp5w\" (UID: \"b5de6234-4404-4321-881f-8d0f2e6d2747\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:28.000786 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:28.000705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5de6234-4404-4321-881f-8d0f2e6d2747-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-lbp5w\" (UID: \"b5de6234-4404-4321-881f-8d0f2e6d2747\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:28.001020 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:28.001005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5de6234-4404-4321-881f-8d0f2e6d2747-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-lbp5w\" (UID: \"b5de6234-4404-4321-881f-8d0f2e6d2747\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:28.011959 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:28.011936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnf7q\" (UniqueName: \"kubernetes.io/projected/b5de6234-4404-4321-881f-8d0f2e6d2747-kube-api-access-xnf7q\") pod \"seaweedfs-tls-custom-ddd4dbfd-lbp5w\" (UID: \"b5de6234-4404-4321-881f-8d0f2e6d2747\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:28.134272 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:28.134253 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:28.244438 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:28.244415 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"]
Apr 21 04:04:28.246969 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:04:28.246942 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5de6234_4404_4321_881f_8d0f2e6d2747.slice/crio-78c912b4034d884d5af1b024a2d0f60b03a73039c5d16c2fe40890eccdb9fb00 WatchSource:0}: Error finding container 78c912b4034d884d5af1b024a2d0f60b03a73039c5d16c2fe40890eccdb9fb00: Status 404 returned error can't find the container with id 78c912b4034d884d5af1b024a2d0f60b03a73039c5d16c2fe40890eccdb9fb00
Apr 21 04:04:28.270205 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:28.270176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w" event={"ID":"b5de6234-4404-4321-881f-8d0f2e6d2747","Type":"ContainerStarted","Data":"78c912b4034d884d5af1b024a2d0f60b03a73039c5d16c2fe40890eccdb9fb00"}
Apr 21 04:04:29.274101 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:29.274067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w" event={"ID":"b5de6234-4404-4321-881f-8d0f2e6d2747","Type":"ContainerStarted","Data":"c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404"}
Apr 21 04:04:29.290301 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:29.290261 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w" podStartSLOduration=2.01268237 podStartE2EDuration="2.290249034s" podCreationTimestamp="2026-04-21 04:04:27 +0000 UTC" firstStartedPulling="2026-04-21 04:04:28.248254346 +0000 UTC m=+453.912844409" lastFinishedPulling="2026-04-21 04:04:28.525821008 +0000 UTC m=+454.190411073" observedRunningTime="2026-04-21 04:04:29.289209896 +0000 UTC m=+454.953800002" watchObservedRunningTime="2026-04-21 04:04:29.290249034 +0000 UTC m=+454.954839120"
Apr 21 04:04:30.494615 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:30.494582 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"]
Apr 21 04:04:31.282732 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:31.279988 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w" podUID="b5de6234-4404-4321-881f-8d0f2e6d2747" containerName="seaweedfs-tls-custom" containerID="cri-o://c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404" gracePeriod=30
Apr 21 04:04:32.521051 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:32.521028 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"
Apr 21 04:04:32.533274 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:32.533252 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnf7q\" (UniqueName: \"kubernetes.io/projected/b5de6234-4404-4321-881f-8d0f2e6d2747-kube-api-access-xnf7q\") pod \"b5de6234-4404-4321-881f-8d0f2e6d2747\" (UID: \"b5de6234-4404-4321-881f-8d0f2e6d2747\") "
Apr 21 04:04:32.533388 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:32.533342 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5de6234-4404-4321-881f-8d0f2e6d2747-data\") pod \"b5de6234-4404-4321-881f-8d0f2e6d2747\" (UID: \"b5de6234-4404-4321-881f-8d0f2e6d2747\") "
Apr 21 04:04:32.534584 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:32.534556 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5de6234-4404-4321-881f-8d0f2e6d2747-data" (OuterVolumeSpecName: "data") pod "b5de6234-4404-4321-881f-8d0f2e6d2747" (UID:
"b5de6234-4404-4321-881f-8d0f2e6d2747"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:04:32.535718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:32.535687 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5de6234-4404-4321-881f-8d0f2e6d2747-kube-api-access-xnf7q" (OuterVolumeSpecName: "kube-api-access-xnf7q") pod "b5de6234-4404-4321-881f-8d0f2e6d2747" (UID: "b5de6234-4404-4321-881f-8d0f2e6d2747"). InnerVolumeSpecName "kube-api-access-xnf7q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:04:32.633922 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:32.633886 2575 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b5de6234-4404-4321-881f-8d0f2e6d2747-data\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:04:32.633922 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:32.633920 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnf7q\" (UniqueName: \"kubernetes.io/projected/b5de6234-4404-4321-881f-8d0f2e6d2747-kube-api-access-xnf7q\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:04:33.288558 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.288475 2575 generic.go:358] "Generic (PLEG): container finished" podID="b5de6234-4404-4321-881f-8d0f2e6d2747" containerID="c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404" exitCode=0 Apr 21 04:04:33.288558 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.288529 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w" Apr 21 04:04:33.288743 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.288559 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w" event={"ID":"b5de6234-4404-4321-881f-8d0f2e6d2747","Type":"ContainerDied","Data":"c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404"} Apr 21 04:04:33.288743 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.288595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w" event={"ID":"b5de6234-4404-4321-881f-8d0f2e6d2747","Type":"ContainerDied","Data":"78c912b4034d884d5af1b024a2d0f60b03a73039c5d16c2fe40890eccdb9fb00"} Apr 21 04:04:33.288743 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.288612 2575 scope.go:117] "RemoveContainer" containerID="c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404" Apr 21 04:04:33.297006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.296997 2575 scope.go:117] "RemoveContainer" containerID="c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404" Apr 21 04:04:33.297242 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:04:33.297224 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404\": container with ID starting with c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404 not found: ID does not exist" containerID="c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404" Apr 21 04:04:33.297295 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.297249 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404"} err="failed to get container status \"c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404\": rpc error: code = 
NotFound desc = could not find container \"c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404\": container with ID starting with c833baf4a2f9f40822d39d93096d0f93227cb83511162c1cacf74823bba06404 not found: ID does not exist" Apr 21 04:04:33.302902 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.302880 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"] Apr 21 04:04:33.304850 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.304830 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-lbp5w"] Apr 21 04:04:33.330087 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.330065 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8"] Apr 21 04:04:33.330352 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.330339 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5de6234-4404-4321-881f-8d0f2e6d2747" containerName="seaweedfs-tls-custom" Apr 21 04:04:33.330400 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.330354 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5de6234-4404-4321-881f-8d0f2e6d2747" containerName="seaweedfs-tls-custom" Apr 21 04:04:33.330400 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.330394 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5de6234-4404-4321-881f-8d0f2e6d2747" containerName="seaweedfs-tls-custom" Apr 21 04:04:33.334409 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.334395 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.336587 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.336566 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 21 04:04:33.336834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.336821 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 21 04:04:33.339695 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.339674 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8"] Apr 21 04:04:33.439800 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.439778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1ad35793-a530-4240-9d36-eeca50239573-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: \"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.439895 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.439815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx2ng\" (UniqueName: \"kubernetes.io/projected/1ad35793-a530-4240-9d36-eeca50239573-kube-api-access-fx2ng\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: \"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.439895 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.439846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ad35793-a530-4240-9d36-eeca50239573-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: \"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 
21 04:04:33.540497 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.540442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1ad35793-a530-4240-9d36-eeca50239573-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: \"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.540497 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.540472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fx2ng\" (UniqueName: \"kubernetes.io/projected/1ad35793-a530-4240-9d36-eeca50239573-kube-api-access-fx2ng\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: \"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.540901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.540507 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ad35793-a530-4240-9d36-eeca50239573-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: \"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.540901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.540812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ad35793-a530-4240-9d36-eeca50239573-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: \"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.542854 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.542837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1ad35793-a530-4240-9d36-eeca50239573-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: 
\"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.548088 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.548067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx2ng\" (UniqueName: \"kubernetes.io/projected/1ad35793-a530-4240-9d36-eeca50239573-kube-api-access-fx2ng\") pod \"seaweedfs-tls-custom-5c88b85bb7-94wk8\" (UID: \"1ad35793-a530-4240-9d36-eeca50239573\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.643693 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.643669 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" Apr 21 04:04:33.754567 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:33.754499 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8"] Apr 21 04:04:33.756727 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:04:33.756699 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad35793_a530_4240_9d36_eeca50239573.slice/crio-dc9fd20d1deafc61d6a832761fb08cd7478b4942c340be0344ee7845f9b151f1 WatchSource:0}: Error finding container dc9fd20d1deafc61d6a832761fb08cd7478b4942c340be0344ee7845f9b151f1: Status 404 returned error can't find the container with id dc9fd20d1deafc61d6a832761fb08cd7478b4942c340be0344ee7845f9b151f1 Apr 21 04:04:34.293335 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:34.293249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" event={"ID":"1ad35793-a530-4240-9d36-eeca50239573","Type":"ContainerStarted","Data":"0f195ab772072bdf18fab9ee39c0ac51c3a93710be2465d347b818b18a423c13"} Apr 21 04:04:34.293335 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:34.293289 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" event={"ID":"1ad35793-a530-4240-9d36-eeca50239573","Type":"ContainerStarted","Data":"dc9fd20d1deafc61d6a832761fb08cd7478b4942c340be0344ee7845f9b151f1"} Apr 21 04:04:34.308490 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:34.308434 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94wk8" podStartSLOduration=1.057885798 podStartE2EDuration="1.308421823s" podCreationTimestamp="2026-04-21 04:04:33 +0000 UTC" firstStartedPulling="2026-04-21 04:04:33.75787136 +0000 UTC m=+459.422461427" lastFinishedPulling="2026-04-21 04:04:34.008407387 +0000 UTC m=+459.672997452" observedRunningTime="2026-04-21 04:04:34.307250841 +0000 UTC m=+459.971840927" watchObservedRunningTime="2026-04-21 04:04:34.308421823 +0000 UTC m=+459.973011945" Apr 21 04:04:35.027480 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:35.027449 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5de6234-4404-4321-881f-8d0f2e6d2747" path="/var/lib/kubelet/pods/b5de6234-4404-4321-881f-8d0f2e6d2747/volumes" Apr 21 04:04:43.714645 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.714618 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7"] Apr 21 04:04:43.717597 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.717582 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.719988 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.719970 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 21 04:04:43.720068 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.720048 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 21 04:04:43.724298 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.724274 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7"] Apr 21 04:04:43.812910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.812878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/78d45106-756d-4814-b4a2-020d8e60ccb9-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.813019 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.812939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/78d45106-756d-4814-b4a2-020d8e60ccb9-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.813019 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.813003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldqqr\" (UniqueName: \"kubernetes.io/projected/78d45106-756d-4814-b4a2-020d8e60ccb9-kube-api-access-ldqqr\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.914288 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.914258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldqqr\" (UniqueName: \"kubernetes.io/projected/78d45106-756d-4814-b4a2-020d8e60ccb9-kube-api-access-ldqqr\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.914440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.914306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/78d45106-756d-4814-b4a2-020d8e60ccb9-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.914440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.914324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/78d45106-756d-4814-b4a2-020d8e60ccb9-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.914639 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.914620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/78d45106-756d-4814-b4a2-020d8e60ccb9-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.916598 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.916575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: 
\"kubernetes.io/projected/78d45106-756d-4814-b4a2-020d8e60ccb9-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:43.922559 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:43.922536 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldqqr\" (UniqueName: \"kubernetes.io/projected/78d45106-756d-4814-b4a2-020d8e60ccb9-kube-api-access-ldqqr\") pod \"seaweedfs-tls-serving-7fd5766db9-sxjd7\" (UID: \"78d45106-756d-4814-b4a2-020d8e60ccb9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:44.028065 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:44.027999 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" Apr 21 04:04:44.138602 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:44.138577 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7"] Apr 21 04:04:44.143276 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:04:44.143244 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d45106_756d_4814_b4a2_020d8e60ccb9.slice/crio-39afc99911e9071ac5433231259324572e57ab57f7696709d7d5e53d99bb44cb WatchSource:0}: Error finding container 39afc99911e9071ac5433231259324572e57ab57f7696709d7d5e53d99bb44cb: Status 404 returned error can't find the container with id 39afc99911e9071ac5433231259324572e57ab57f7696709d7d5e53d99bb44cb Apr 21 04:04:44.319320 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:44.319257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" event={"ID":"78d45106-756d-4814-b4a2-020d8e60ccb9","Type":"ContainerStarted","Data":"39afc99911e9071ac5433231259324572e57ab57f7696709d7d5e53d99bb44cb"} Apr 21 04:04:45.323589 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:04:45.323555 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" event={"ID":"78d45106-756d-4814-b4a2-020d8e60ccb9","Type":"ContainerStarted","Data":"39350b13c38bdf2d67f2a9489b01d611f423f1b59e8b9ccef7fe9a6494570305"} Apr 21 04:04:45.338165 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:04:45.338118 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sxjd7" podStartSLOduration=1.998046361 podStartE2EDuration="2.338105088s" podCreationTimestamp="2026-04-21 04:04:43 +0000 UTC" firstStartedPulling="2026-04-21 04:04:44.145197979 +0000 UTC m=+469.809788044" lastFinishedPulling="2026-04-21 04:04:44.485256699 +0000 UTC m=+470.149846771" observedRunningTime="2026-04-21 04:04:45.337527801 +0000 UTC m=+471.002117897" watchObservedRunningTime="2026-04-21 04:04:45.338105088 +0000 UTC m=+471.002695174" Apr 21 04:05:03.172132 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.172099 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp"] Apr 21 04:05:03.178910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.178882 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.181734 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.181712 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wxnxz\"" Apr 21 04:05:03.182007 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.181988 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 21 04:05:03.182148 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.182024 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 21 04:05:03.182238 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.182060 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 21 04:05:03.182374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.182069 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 04:05:03.182445 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.182283 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp"] Apr 21 04:05:03.250160 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.250138 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/213f870a-9cea-44a6-a0ef-057957323d2c-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.250287 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.250175 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/213f870a-9cea-44a6-a0ef-057957323d2c-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.250287 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.250248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdq5p\" (UniqueName: \"kubernetes.io/projected/213f870a-9cea-44a6-a0ef-057957323d2c-kube-api-access-qdq5p\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.250404 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.250351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/213f870a-9cea-44a6-a0ef-057957323d2c-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.351357 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.351329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/213f870a-9cea-44a6-a0ef-057957323d2c-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.351456 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:05:03.351378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/213f870a-9cea-44a6-a0ef-057957323d2c-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.351456 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.351409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/213f870a-9cea-44a6-a0ef-057957323d2c-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.351553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.351462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdq5p\" (UniqueName: \"kubernetes.io/projected/213f870a-9cea-44a6-a0ef-057957323d2c-kube-api-access-qdq5p\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.351802 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.351783 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/213f870a-9cea-44a6-a0ef-057957323d2c-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.352049 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.352031 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/213f870a-9cea-44a6-a0ef-057957323d2c-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.353991 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.353970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/213f870a-9cea-44a6-a0ef-057957323d2c-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.359601 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.359576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdq5p\" (UniqueName: \"kubernetes.io/projected/213f870a-9cea-44a6-a0ef-057957323d2c-kube-api-access-qdq5p\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.490195 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.490125 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:03.604613 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:03.604596 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp"] Apr 21 04:05:03.606876 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:05:03.606843 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod213f870a_9cea_44a6_a0ef_057957323d2c.slice/crio-cec6703d4c5bc60b05454cd4fa533e55ff3f4338a54820ff841b445062c72af6 WatchSource:0}: Error finding container cec6703d4c5bc60b05454cd4fa533e55ff3f4338a54820ff841b445062c72af6: Status 404 returned error can't find the container with id cec6703d4c5bc60b05454cd4fa533e55ff3f4338a54820ff841b445062c72af6 Apr 21 04:05:04.374130 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:04.374088 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerStarted","Data":"cec6703d4c5bc60b05454cd4fa533e55ff3f4338a54820ff841b445062c72af6"} Apr 21 04:05:08.387806 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:08.387770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerStarted","Data":"1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460"} Apr 21 04:05:12.399310 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:12.399276 2575 generic.go:358] "Generic (PLEG): container finished" podID="213f870a-9cea-44a6-a0ef-057957323d2c" containerID="1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460" exitCode=0 Apr 21 04:05:12.399708 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:12.399360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerDied","Data":"1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460"} Apr 21 04:05:26.447874 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:26.447832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerStarted","Data":"30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef"} Apr 21 04:05:28.455412 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:28.455340 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerStarted","Data":"f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d"} Apr 21 04:05:31.466287 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:31.466214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerStarted","Data":"d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2"} Apr 21 04:05:31.466621 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:31.466334 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:31.487177 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:31.487131 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podStartSLOduration=0.907634396 podStartE2EDuration="28.48711722s" podCreationTimestamp="2026-04-21 04:05:03 +0000 UTC" firstStartedPulling="2026-04-21 04:05:03.608806623 +0000 UTC m=+489.273396686" lastFinishedPulling="2026-04-21 
04:05:31.188289443 +0000 UTC m=+516.852879510" observedRunningTime="2026-04-21 04:05:31.485544783 +0000 UTC m=+517.150134869" watchObservedRunningTime="2026-04-21 04:05:31.48711722 +0000 UTC m=+517.151707307" Apr 21 04:05:32.473294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:32.472722 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:32.473294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:32.472736 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:05:32.473294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:32.472780 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:32.473857 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:32.473808 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:05:33.473319 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:33.473277 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:05:33.473753 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:33.473621 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" 
podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:05:33.476842 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:33.476821 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:05:34.476242 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:34.476201 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:05:34.476740 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:34.476544 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:05:35.478374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:35.478334 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:05:35.478740 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:35.478652 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:05:45.479150 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:45.479045 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:05:45.479552 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:45.479235 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:05:55.478442 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:55.478396 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:05:55.478847 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:05:55.478823 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:06:05.478804 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:05.478732 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:06:05.479286 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:05.479128 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 21 04:06:15.478487 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:15.478443 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:06:15.478977 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:15.478954 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:06:25.478356 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:25.478312 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:06:25.478803 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:25.478781 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:06:35.478942 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:35.478913 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:06:35.479418 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:35.478973 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:06:48.278197 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:06:48.278166 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp"] Apr 21 04:06:48.278655 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.278483 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" containerID="cri-o://30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef" gracePeriod=30 Apr 21 04:06:48.278655 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.278523 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" containerID="cri-o://d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2" gracePeriod=30 Apr 21 04:06:48.278784 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.278532 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" containerID="cri-o://f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d" gracePeriod=30 Apr 21 04:06:48.393929 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.393890 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"] Apr 21 04:06:48.397946 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.397929 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.400368 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.400351 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 21 04:06:48.400474 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.400427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 21 04:06:48.407124 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.407100 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"] Apr 21 04:06:48.473371 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.473343 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 21 04:06:48.530234 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.530175 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8605b0-143b-4fc4-8053-a54718ec065f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.530234 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.530212 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/6a8605b0-143b-4fc4-8053-a54718ec065f-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.530374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.530298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcxsk\" (UniqueName: \"kubernetes.io/projected/6a8605b0-143b-4fc4-8053-a54718ec065f-kube-api-access-hcxsk\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.530374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.530356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8605b0-143b-4fc4-8053-a54718ec065f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.631376 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.631349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8605b0-143b-4fc4-8053-a54718ec065f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.631469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.631382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/6a8605b0-143b-4fc4-8053-a54718ec065f-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.631469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.631418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcxsk\" (UniqueName: \"kubernetes.io/projected/6a8605b0-143b-4fc4-8053-a54718ec065f-kube-api-access-hcxsk\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.631469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.631464 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8605b0-143b-4fc4-8053-a54718ec065f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.631939 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.631920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8605b0-143b-4fc4-8053-a54718ec065f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.632012 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.631994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/6a8605b0-143b-4fc4-8053-a54718ec065f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.633796 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.633779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8605b0-143b-4fc4-8053-a54718ec065f-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.639227 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.639208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcxsk\" (UniqueName: \"kubernetes.io/projected/6a8605b0-143b-4fc4-8053-a54718ec065f-kube-api-access-hcxsk\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.683894 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.683861 2575 generic.go:358] "Generic (PLEG): container finished" podID="213f870a-9cea-44a6-a0ef-057957323d2c" containerID="f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d" exitCode=2 Apr 21 04:06:48.683985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.683914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerDied","Data":"f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d"} Apr 21 04:06:48.708047 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.708031 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:48.827378 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:48.827354 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"] Apr 21 04:06:48.830071 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:06:48.830038 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8605b0_143b_4fc4_8053_a54718ec065f.slice/crio-7a4a4d9036a9cabaecbd980c84e0fcbe2570ecb5c072af66e85a064a66c45f4c WatchSource:0}: Error finding container 7a4a4d9036a9cabaecbd980c84e0fcbe2570ecb5c072af66e85a064a66c45f4c: Status 404 returned error can't find the container with id 7a4a4d9036a9cabaecbd980c84e0fcbe2570ecb5c072af66e85a064a66c45f4c Apr 21 04:06:49.687682 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:49.687642 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerStarted","Data":"b392afc37e1ee42d3c41862693de76fb59b01f1bd932300a0fa4e67c3bebe6bd"} Apr 21 04:06:49.687682 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:49.687681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerStarted","Data":"7a4a4d9036a9cabaecbd980c84e0fcbe2570ecb5c072af66e85a064a66c45f4c"} Apr 21 04:06:52.698199 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:52.698162 2575 generic.go:358] "Generic (PLEG): container finished" podID="213f870a-9cea-44a6-a0ef-057957323d2c" containerID="30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef" exitCode=0 Apr 21 04:06:52.698551 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:52.698239 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerDied","Data":"30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef"} Apr 21 04:06:53.473806 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:53.473744 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 21 04:06:53.701852 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:53.701818 2575 generic.go:358] "Generic (PLEG): container finished" podID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerID="b392afc37e1ee42d3c41862693de76fb59b01f1bd932300a0fa4e67c3bebe6bd" exitCode=0 Apr 21 04:06:53.702216 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:53.701896 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerDied","Data":"b392afc37e1ee42d3c41862693de76fb59b01f1bd932300a0fa4e67c3bebe6bd"} Apr 21 04:06:54.706264 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.706232 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerStarted","Data":"8378b963ccb72ebb78eeca436e82331727dd93d3e030b31aef738c5525a10a73"} Apr 21 04:06:54.706264 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.706270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" 
event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerStarted","Data":"7a6359bfa05905dfa800da93edfc4df7859a5623567d6b2848f9781b9c73df81"} Apr 21 04:06:54.706698 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.706280 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerStarted","Data":"2fbd1774b5a62b93e5a4d6ddfa43a3724976b6660d0fd0c78aa41dd6f8c45a0c"} Apr 21 04:06:54.706776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.706725 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:54.706776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.706750 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:54.706776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.706775 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:06:54.708120 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.708091 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 21 04:06:54.708707 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.708687 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:06:54.726407 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:54.726369 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podStartSLOduration=6.726358548 podStartE2EDuration="6.726358548s" podCreationTimestamp="2026-04-21 04:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:06:54.724420019 +0000 UTC m=+600.389010099" watchObservedRunningTime="2026-04-21 04:06:54.726358548 +0000 UTC m=+600.390948634" Apr 21 04:06:55.479040 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:55.478989 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:06:55.479375 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:55.479348 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:06:55.709705 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:55.709663 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 21 04:06:55.710188 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:55.710110 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:06:58.474345 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:58.474304 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 21 04:06:58.474787 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:06:58.474452 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:07:00.713143 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:00.713113 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:07:00.713820 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:00.713790 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 21 04:07:00.714029 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:00.714007 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:07:03.473806 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:03.473738 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" 
probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 21 04:07:05.479159 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:05.479118 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:07:05.479596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:05.479432 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:07:08.474341 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:08.474296 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 21 04:07:10.714543 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:10.714497 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 21 04:07:10.714966 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:10.714942 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 21 04:07:13.474390 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:13.474352 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 21 04:07:15.478491 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:15.478402 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 21 04:07:15.478901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:15.478559 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:07:15.478901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:15.478672 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:07:15.478901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:15.478840 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:07:18.422353 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.422331 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:07:18.541360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.541281 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/213f870a-9cea-44a6-a0ef-057957323d2c-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"213f870a-9cea-44a6-a0ef-057957323d2c\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " Apr 21 04:07:18.541360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.541321 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/213f870a-9cea-44a6-a0ef-057957323d2c-kserve-provision-location\") pod \"213f870a-9cea-44a6-a0ef-057957323d2c\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " Apr 21 04:07:18.541360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.541343 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/213f870a-9cea-44a6-a0ef-057957323d2c-proxy-tls\") pod \"213f870a-9cea-44a6-a0ef-057957323d2c\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " Apr 21 04:07:18.541626 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.541484 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdq5p\" (UniqueName: \"kubernetes.io/projected/213f870a-9cea-44a6-a0ef-057957323d2c-kube-api-access-qdq5p\") pod \"213f870a-9cea-44a6-a0ef-057957323d2c\" (UID: \"213f870a-9cea-44a6-a0ef-057957323d2c\") " Apr 21 04:07:18.541710 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.541680 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/213f870a-9cea-44a6-a0ef-057957323d2c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"213f870a-9cea-44a6-a0ef-057957323d2c" (UID: "213f870a-9cea-44a6-a0ef-057957323d2c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:07:18.541831 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.541721 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213f870a-9cea-44a6-a0ef-057957323d2c-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "213f870a-9cea-44a6-a0ef-057957323d2c" (UID: "213f870a-9cea-44a6-a0ef-057957323d2c"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:07:18.543610 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.543584 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213f870a-9cea-44a6-a0ef-057957323d2c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "213f870a-9cea-44a6-a0ef-057957323d2c" (UID: "213f870a-9cea-44a6-a0ef-057957323d2c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:07:18.543720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.543583 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/213f870a-9cea-44a6-a0ef-057957323d2c-kube-api-access-qdq5p" (OuterVolumeSpecName: "kube-api-access-qdq5p") pod "213f870a-9cea-44a6-a0ef-057957323d2c" (UID: "213f870a-9cea-44a6-a0ef-057957323d2c"). InnerVolumeSpecName "kube-api-access-qdq5p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:07:18.642706 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.642673 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdq5p\" (UniqueName: \"kubernetes.io/projected/213f870a-9cea-44a6-a0ef-057957323d2c-kube-api-access-qdq5p\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:07:18.642706 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.642703 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/213f870a-9cea-44a6-a0ef-057957323d2c-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:07:18.642706 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.642716 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/213f870a-9cea-44a6-a0ef-057957323d2c-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:07:18.643002 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.642725 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/213f870a-9cea-44a6-a0ef-057957323d2c-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:07:18.773569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.773539 2575 generic.go:358] "Generic (PLEG): container finished" podID="213f870a-9cea-44a6-a0ef-057957323d2c" containerID="d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2" exitCode=0 Apr 21 04:07:18.773721 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.773629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" 
event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerDied","Data":"d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2"} Apr 21 04:07:18.773721 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.773649 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" Apr 21 04:07:18.773721 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.773669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp" event={"ID":"213f870a-9cea-44a6-a0ef-057957323d2c","Type":"ContainerDied","Data":"cec6703d4c5bc60b05454cd4fa533e55ff3f4338a54820ff841b445062c72af6"} Apr 21 04:07:18.773721 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.773687 2575 scope.go:117] "RemoveContainer" containerID="d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2" Apr 21 04:07:18.783970 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.783950 2575 scope.go:117] "RemoveContainer" containerID="f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d" Apr 21 04:07:18.790558 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.790540 2575 scope.go:117] "RemoveContainer" containerID="30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef" Apr 21 04:07:18.795911 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.795860 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp"] Apr 21 04:07:18.797426 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.797411 2575 scope.go:117] "RemoveContainer" containerID="1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460" Apr 21 04:07:18.801489 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.801466 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-k2qvp"] Apr 21 04:07:18.804222 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:07:18.804205 2575 scope.go:117] "RemoveContainer" containerID="d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2" Apr 21 04:07:18.804446 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:07:18.804428 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2\": container with ID starting with d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2 not found: ID does not exist" containerID="d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2" Apr 21 04:07:18.804493 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.804463 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2"} err="failed to get container status \"d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2\": rpc error: code = NotFound desc = could not find container \"d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2\": container with ID starting with d626ecf8f257de9d55b185771b1d4746b32c8844a3e126222ade6362b67ba1c2 not found: ID does not exist" Apr 21 04:07:18.804493 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.804482 2575 scope.go:117] "RemoveContainer" containerID="f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d" Apr 21 04:07:18.804711 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:07:18.804695 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d\": container with ID starting with f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d not found: ID does not exist" containerID="f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d" Apr 21 04:07:18.804769 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:07:18.804717 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d"} err="failed to get container status \"f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d\": rpc error: code = NotFound desc = could not find container \"f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d\": container with ID starting with f8d6e36c2b9eff92cc03ae6bdb2d2333c143204f88f61f705095051c145bde4d not found: ID does not exist" Apr 21 04:07:18.804769 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.804734 2575 scope.go:117] "RemoveContainer" containerID="30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef" Apr 21 04:07:18.804956 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:07:18.804941 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef\": container with ID starting with 30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef not found: ID does not exist" containerID="30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef" Apr 21 04:07:18.804999 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.804960 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef"} err="failed to get container status \"30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef\": rpc error: code = NotFound desc = could not find container \"30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef\": container with ID starting with 30a08428d8bec4f99886999ab3d30f054f8468447f5b0681925ac6af9f2fc0ef not found: ID does not exist" Apr 21 04:07:18.804999 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.804974 2575 scope.go:117] "RemoveContainer" 
containerID="1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460" Apr 21 04:07:18.805196 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:07:18.805180 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460\": container with ID starting with 1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460 not found: ID does not exist" containerID="1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460" Apr 21 04:07:18.805243 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:18.805199 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460"} err="failed to get container status \"1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460\": rpc error: code = NotFound desc = could not find container \"1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460\": container with ID starting with 1a4332a6b0142e33c506f9d5fa13aeaebf3500a9d86c7b5eda60b1318a293460 not found: ID does not exist" Apr 21 04:07:19.028027 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:19.027994 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" path="/var/lib/kubelet/pods/213f870a-9cea-44a6-a0ef-057957323d2c/volumes" Apr 21 04:07:20.714467 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:20.714417 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 21 04:07:20.714915 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:20.714883 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:07:30.714173 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:30.714128 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 21 04:07:30.714677 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:30.714545 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:07:40.714378 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:40.714324 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 21 04:07:40.714806 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:40.714657 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:07:50.714227 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:50.714183 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 21 04:07:50.714688 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:07:50.714637 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:08:00.713944 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:00.713904 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:08:00.714414 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:00.714079 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" Apr 21 04:08:13.443652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.443616 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"] Apr 21 04:08:13.444247 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.444020 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" containerID="cri-o://8378b963ccb72ebb78eeca436e82331727dd93d3e030b31aef738c5525a10a73" gracePeriod=30 Apr 21 04:08:13.444247 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.444085 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy" containerID="cri-o://7a6359bfa05905dfa800da93edfc4df7859a5623567d6b2848f9781b9c73df81" gracePeriod=30 Apr 21 04:08:13.444247 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:08:13.444020 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" containerID="cri-o://2fbd1774b5a62b93e5a4d6ddfa43a3724976b6660d0fd0c78aa41dd6f8c45a0c" gracePeriod=30 Apr 21 04:08:13.505707 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.505676 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"] Apr 21 04:08:13.506030 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506015 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" Apr 21 04:08:13.506077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506032 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" Apr 21 04:08:13.506077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506045 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" Apr 21 04:08:13.506077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506051 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" Apr 21 04:08:13.506077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506060 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="storage-initializer" Apr 21 04:08:13.506077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506067 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="storage-initializer" Apr 21 04:08:13.506077 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:08:13.506077 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" Apr 21 04:08:13.506245 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506082 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" Apr 21 04:08:13.506245 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506125 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kserve-container" Apr 21 04:08:13.506245 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506134 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="kube-rbac-proxy" Apr 21 04:08:13.506245 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.506140 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="213f870a-9cea-44a6-a0ef-057957323d2c" containerName="agent" Apr 21 04:08:13.510311 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.510295 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.512660 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.512638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 21 04:08:13.512660 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.512648 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 21 04:08:13.518373 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.518354 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"] Apr 21 04:08:13.607047 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.607021 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.607137 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.607050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvb29\" (UniqueName: \"kubernetes.io/projected/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-kube-api-access-zvb29\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.607137 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.607075 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.708055 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.707983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.708055 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.708015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvb29\" (UniqueName: \"kubernetes.io/projected/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-kube-api-access-zvb29\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.708055 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.708042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.708276 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:08:13.708134 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 21 04:08:13.708276 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:08:13.708194 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls podName:6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b nodeName:}" failed. No retries permitted until 2026-04-21 04:08:14.208174774 +0000 UTC m=+679.872764843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-l98rb" (UID: "6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b") : secret "message-dumper-predictor-serving-cert" not found Apr 21 04:08:13.708605 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.708586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.716214 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.716191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvb29\" (UniqueName: \"kubernetes.io/projected/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-kube-api-access-zvb29\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:13.924318 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.924285 2575 generic.go:358] "Generic (PLEG): container finished" podID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerID="7a6359bfa05905dfa800da93edfc4df7859a5623567d6b2848f9781b9c73df81" exitCode=2 Apr 21 04:08:13.924462 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:13.924347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" 
event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerDied","Data":"7a6359bfa05905dfa800da93edfc4df7859a5623567d6b2848f9781b9c73df81"} Apr 21 04:08:14.210884 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:14.210840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:14.213181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:14.213156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-l98rb\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:14.420898 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:14.420862 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:08:14.534360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:14.534343 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"] Apr 21 04:08:14.538906 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:08:14.538878 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b08d9d7_a5e2_48d7_abbb_5eb4ae2f5a0b.slice/crio-34e72198ea23dfd0f7bebf67078cbea7818d5199266a608111a2ff588c6e5105 WatchSource:0}: Error finding container 34e72198ea23dfd0f7bebf67078cbea7818d5199266a608111a2ff588c6e5105: Status 404 returned error can't find the container with id 34e72198ea23dfd0f7bebf67078cbea7818d5199266a608111a2ff588c6e5105 Apr 21 04:08:14.541112 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:14.541095 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:08:14.928116 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:14.928086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" event={"ID":"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b","Type":"ContainerStarted","Data":"34e72198ea23dfd0f7bebf67078cbea7818d5199266a608111a2ff588c6e5105"} Apr 21 04:08:15.710347 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:15.710318 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 21 04:08:15.932616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:15.932581 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" event={"ID":"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b","Type":"ContainerStarted","Data":"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f"}
Apr 21 04:08:15.932752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:15.932624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" event={"ID":"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b","Type":"ContainerStarted","Data":"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868"}
Apr 21 04:08:15.932752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:15.932724 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"
Apr 21 04:08:15.950495 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:15.950439 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" podStartSLOduration=1.789689057 podStartE2EDuration="2.95042363s" podCreationTimestamp="2026-04-21 04:08:13 +0000 UTC" firstStartedPulling="2026-04-21 04:08:14.541224073 +0000 UTC m=+680.205814138" lastFinishedPulling="2026-04-21 04:08:15.701958643 +0000 UTC m=+681.366548711" observedRunningTime="2026-04-21 04:08:15.948514704 +0000 UTC m=+681.613104789" watchObservedRunningTime="2026-04-21 04:08:15.95042363 +0000 UTC m=+681.615013716"
Apr 21 04:08:16.935465 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:16.935441 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"
Apr 21 04:08:16.937194 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:16.937171 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"
Apr 21 04:08:17.939594 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:17.939561 2575 generic.go:358] "Generic (PLEG): container finished" podID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerID="2fbd1774b5a62b93e5a4d6ddfa43a3724976b6660d0fd0c78aa41dd6f8c45a0c" exitCode=0
Apr 21 04:08:17.939991 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:17.939626 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerDied","Data":"2fbd1774b5a62b93e5a4d6ddfa43a3724976b6660d0fd0c78aa41dd6f8c45a0c"}
Apr 21 04:08:20.709754 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:20.709705 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused"
Apr 21 04:08:20.714035 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:20.713996 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused"
Apr 21 04:08:20.714326 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:20.714299 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:08:23.947940 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:23.947910 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"
Apr 21 04:08:25.710336 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:25.710293 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused"
Apr 21 04:08:25.710721 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:25.710427 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"
Apr 21 04:08:30.709953 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:30.709908 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused"
Apr 21 04:08:30.714206 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:30.714166 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused"
Apr 21 04:08:30.714486 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:30.714462 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:08:33.577579 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.577550 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"]
Apr 21 04:08:33.580612 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.580595 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.583111 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.583090 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\""
Apr 21 04:08:33.583213 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.583196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\""
Apr 21 04:08:33.590694 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.590670 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"]
Apr 21 04:08:33.649953 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.649921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c6245e6-0f47-41dd-8536-df98877dceb9-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.650100 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.650065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl8k9\" (UniqueName: \"kubernetes.io/projected/4c6245e6-0f47-41dd-8536-df98877dceb9-kube-api-access-bl8k9\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.650176 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.650154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c6245e6-0f47-41dd-8536-df98877dceb9-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.650219 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.650204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c6245e6-0f47-41dd-8536-df98877dceb9-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.750689 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.750667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c6245e6-0f47-41dd-8536-df98877dceb9-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.750832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.750712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bl8k9\" (UniqueName: \"kubernetes.io/projected/4c6245e6-0f47-41dd-8536-df98877dceb9-kube-api-access-bl8k9\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.750832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.750800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c6245e6-0f47-41dd-8536-df98877dceb9-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.750964 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.750833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c6245e6-0f47-41dd-8536-df98877dceb9-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.751118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.751097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c6245e6-0f47-41dd-8536-df98877dceb9-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.751406 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.751386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c6245e6-0f47-41dd-8536-df98877dceb9-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.753110 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.753090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c6245e6-0f47-41dd-8536-df98877dceb9-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.759278 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.759255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl8k9\" (UniqueName: \"kubernetes.io/projected/4c6245e6-0f47-41dd-8536-df98877dceb9-kube-api-access-bl8k9\") pod \"isvc-logger-predictor-64d54fcc88-sc54t\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:33.891062 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:33.891029 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:34.009869 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:34.009843 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"]
Apr 21 04:08:34.012089 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:08:34.012056 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c6245e6_0f47_41dd_8536_df98877dceb9.slice/crio-944b62e81fec0891c2276318fa475b440b7bd512fdec053dd39c8139bccb7871 WatchSource:0}: Error finding container 944b62e81fec0891c2276318fa475b440b7bd512fdec053dd39c8139bccb7871: Status 404 returned error can't find the container with id 944b62e81fec0891c2276318fa475b440b7bd512fdec053dd39c8139bccb7871
Apr 21 04:08:34.986862 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:34.986826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerStarted","Data":"92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45"}
Apr 21 04:08:34.986862 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:34.986860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerStarted","Data":"944b62e81fec0891c2276318fa475b440b7bd512fdec053dd39c8139bccb7871"}
Apr 21 04:08:35.710080 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:35.710039 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused"
Apr 21 04:08:37.997466 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:37.997381 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerID="92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45" exitCode=0
Apr 21 04:08:37.997842 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:37.997457 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerDied","Data":"92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45"}
Apr 21 04:08:39.002228 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:39.002195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerStarted","Data":"fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa"}
Apr 21 04:08:39.002568 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:39.002235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerStarted","Data":"2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313"}
Apr 21 04:08:39.002568 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:39.002249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerStarted","Data":"4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd"}
Apr 21 04:08:39.002568 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:39.002543 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:39.002705 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:39.002685 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:39.003750 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:39.003726 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:08:39.021469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:39.021427 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podStartSLOduration=6.021414769 podStartE2EDuration="6.021414769s" podCreationTimestamp="2026-04-21 04:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:08:39.020347933 +0000 UTC m=+704.684938019" watchObservedRunningTime="2026-04-21 04:08:39.021414769 +0000 UTC m=+704.686004855"
Apr 21 04:08:40.004733 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:40.004675 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:08:40.005157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:40.004699 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:40.005619 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:40.005594 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:08:40.710379 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:40.710343 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused"
Apr 21 04:08:40.713653 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:40.713624 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused"
Apr 21 04:08:40.713821 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:40.713798 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"
Apr 21 04:08:40.714039 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:40.714020 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:08:40.714118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:40.714106 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"
Apr 21 04:08:41.007292 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:41.007205 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:08:41.007746 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:41.007722 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:08:44.018556 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.018527 2575 generic.go:358] "Generic (PLEG): container finished" podID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerID="8378b963ccb72ebb78eeca436e82331727dd93d3e030b31aef738c5525a10a73" exitCode=0
Apr 21 04:08:44.018870 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.018573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerDied","Data":"8378b963ccb72ebb78eeca436e82331727dd93d3e030b31aef738c5525a10a73"}
Apr 21 04:08:44.080745 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.080723 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"
Apr 21 04:08:44.226317 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.226223 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8605b0-143b-4fc4-8053-a54718ec065f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"6a8605b0-143b-4fc4-8053-a54718ec065f\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") "
Apr 21 04:08:44.226317 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.226284 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8605b0-143b-4fc4-8053-a54718ec065f-proxy-tls\") pod \"6a8605b0-143b-4fc4-8053-a54718ec065f\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") "
Apr 21 04:08:44.226494 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.226334 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcxsk\" (UniqueName: \"kubernetes.io/projected/6a8605b0-143b-4fc4-8053-a54718ec065f-kube-api-access-hcxsk\") pod \"6a8605b0-143b-4fc4-8053-a54718ec065f\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") "
Apr 21 04:08:44.226494 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.226358 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8605b0-143b-4fc4-8053-a54718ec065f-kserve-provision-location\") pod \"6a8605b0-143b-4fc4-8053-a54718ec065f\" (UID: \"6a8605b0-143b-4fc4-8053-a54718ec065f\") "
Apr 21 04:08:44.226624 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.226596 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8605b0-143b-4fc4-8053-a54718ec065f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "6a8605b0-143b-4fc4-8053-a54718ec065f" (UID: "6a8605b0-143b-4fc4-8053-a54718ec065f"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:08:44.226721 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.226701 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a8605b0-143b-4fc4-8053-a54718ec065f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a8605b0-143b-4fc4-8053-a54718ec065f" (UID: "6a8605b0-143b-4fc4-8053-a54718ec065f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:08:44.228327 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.228302 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8605b0-143b-4fc4-8053-a54718ec065f-kube-api-access-hcxsk" (OuterVolumeSpecName: "kube-api-access-hcxsk") pod "6a8605b0-143b-4fc4-8053-a54718ec065f" (UID: "6a8605b0-143b-4fc4-8053-a54718ec065f"). InnerVolumeSpecName "kube-api-access-hcxsk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:08:44.228436 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.228334 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8605b0-143b-4fc4-8053-a54718ec065f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6a8605b0-143b-4fc4-8053-a54718ec065f" (UID: "6a8605b0-143b-4fc4-8053-a54718ec065f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:08:44.327885 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.327860 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8605b0-143b-4fc4-8053-a54718ec065f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:08:44.327885 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.327881 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8605b0-143b-4fc4-8053-a54718ec065f-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:08:44.328013 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.327891 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hcxsk\" (UniqueName: \"kubernetes.io/projected/6a8605b0-143b-4fc4-8053-a54718ec065f-kube-api-access-hcxsk\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:08:44.328013 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:44.327901 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8605b0-143b-4fc4-8053-a54718ec065f-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:08:45.022929 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:45.022894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v" event={"ID":"6a8605b0-143b-4fc4-8053-a54718ec065f","Type":"ContainerDied","Data":"7a4a4d9036a9cabaecbd980c84e0fcbe2570ecb5c072af66e85a064a66c45f4c"}
Apr 21 04:08:45.022929 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:45.022937 2575 scope.go:117] "RemoveContainer" containerID="8378b963ccb72ebb78eeca436e82331727dd93d3e030b31aef738c5525a10a73"
Apr 21 04:08:45.023402 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:45.022976 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"
Apr 21 04:08:45.030817 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:45.030786 2575 scope.go:117] "RemoveContainer" containerID="7a6359bfa05905dfa800da93edfc4df7859a5623567d6b2848f9781b9c73df81"
Apr 21 04:08:45.038058 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:45.038038 2575 scope.go:117] "RemoveContainer" containerID="2fbd1774b5a62b93e5a4d6ddfa43a3724976b6660d0fd0c78aa41dd6f8c45a0c"
Apr 21 04:08:45.045233 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:45.045185 2575 scope.go:117] "RemoveContainer" containerID="b392afc37e1ee42d3c41862693de76fb59b01f1bd932300a0fa4e67c3bebe6bd"
Apr 21 04:08:45.047965 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:45.047943 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"]
Apr 21 04:08:45.051348 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:45.051326 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-jgg2v"]
Apr 21 04:08:46.011422 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:46.011393 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:08:46.011955 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:46.011929 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:08:46.012416 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:46.012390 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:08:47.028082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:47.028046 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" path="/var/lib/kubelet/pods/6a8605b0-143b-4fc4-8053-a54718ec065f/volumes"
Apr 21 04:08:56.012662 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:56.012614 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:08:56.013095 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:08:56.013028 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:09:06.012209 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:06.012171 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:09:06.012661 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:06.012605 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:09:16.012657 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:16.012615 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:09:16.013152 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:16.013128 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:09:26.012344 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:26.012297 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:09:26.012823 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:26.012660 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:09:36.011860 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:36.011815 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 21 04:09:36.012236 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:36.012200 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:09:46.013026 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:46.012996 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:09:46.013451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:46.013273 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:09:58.620321 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.620285 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-l98rb_6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b/kserve-container/0.log"
Apr 21 04:09:58.798853 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.798815 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"]
Apr 21 04:09:58.799319 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.799286 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" containerID="cri-o://fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa" gracePeriod=30
Apr 21 04:09:58.799563 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.799267 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" containerID="cri-o://4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd" gracePeriod=30
Apr 21 04:09:58.799690 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.799292 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy" containerID="cri-o://2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313" gracePeriod=30
Apr 21 04:09:58.861909 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.861872 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"]
Apr 21 04:09:58.862167 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862156 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="storage-initializer"
Apr 21 04:09:58.862218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862169 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="storage-initializer"
Apr 21 04:09:58.862218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862181 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent"
Apr 21 04:09:58.862218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862187 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent"
Apr 21 04:09:58.862218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862195 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container"
Apr 21 04:09:58.862218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862201 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container"
Apr 21 04:09:58.862218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862207 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy"
Apr 21 04:09:58.862218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862213 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy"
Apr 21 04:09:58.862432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862264 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kserve-container"
Apr 21 04:09:58.862432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862274 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="kube-rbac-proxy"
Apr 21 04:09:58.862432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.862280 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a8605b0-143b-4fc4-8053-a54718ec065f" containerName="agent"
Apr 21 04:09:58.865290 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.865269 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"
Apr 21 04:09:58.867706 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.867686 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\""
Apr 21 04:09:58.867791 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.867691 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\""
Apr 21 04:09:58.874200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.874170 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"]
Apr 21 04:09:58.928822 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.928798 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"]
Apr 21 04:09:58.929096 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.929071 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kserve-container" containerID="cri-o://8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868" gracePeriod=30
Apr 21 04:09:58.929192 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.929104 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kube-rbac-proxy" containerID="cri-o://7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f" gracePeriod=30
Apr 21 04:09:58.942982 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.942949 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.23:8643/healthz\": dial tcp 10.132.0.23:8643: connect: connection refused"
Apr 21 04:09:58.953303 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.953282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh46r\" (UniqueName: \"kubernetes.io/projected/27277bf6-c33b-46e1-b722-533703e7c2ff-kube-api-access-sh46r\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"
Apr 21 04:09:58.953394 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.953346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27277bf6-c33b-46e1-b722-533703e7c2ff-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") "
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:58.953437 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.953394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27277bf6-c33b-46e1-b722-533703e7c2ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:58.953482 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:58.953433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27277bf6-c33b-46e1-b722-533703e7c2ff-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.054654 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.054625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27277bf6-c33b-46e1-b722-533703e7c2ff-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.054745 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.054663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27277bf6-c33b-46e1-b722-533703e7c2ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.054745 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:09:59.054694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27277bf6-c33b-46e1-b722-533703e7c2ff-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.054745 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.054739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh46r\" (UniqueName: \"kubernetes.io/projected/27277bf6-c33b-46e1-b722-533703e7c2ff-kube-api-access-sh46r\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.055076 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.055050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27277bf6-c33b-46e1-b722-533703e7c2ff-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.055293 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.055274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27277bf6-c33b-46e1-b722-533703e7c2ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.057088 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.057057 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/27277bf6-c33b-46e1-b722-533703e7c2ff-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.062583 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.062551 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh46r\" (UniqueName: \"kubernetes.io/projected/27277bf6-c33b-46e1-b722-533703e7c2ff-kube-api-access-sh46r\") pod \"isvc-lightgbm-predictor-bdf964bd-75qr9\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.156444 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.156422 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:09:59.175292 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.175266 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:09:59.231237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.231108 2575 generic.go:358] "Generic (PLEG): container finished" podID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerID="7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f" exitCode=2 Apr 21 04:09:59.231237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.231140 2575 generic.go:358] "Generic (PLEG): container finished" podID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerID="8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868" exitCode=2 Apr 21 04:09:59.231237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.231180 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" event={"ID":"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b","Type":"ContainerDied","Data":"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f"} Apr 21 04:09:59.231237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.231223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" event={"ID":"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b","Type":"ContainerDied","Data":"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868"} Apr 21 04:09:59.231573 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.231242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" event={"ID":"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b","Type":"ContainerDied","Data":"34e72198ea23dfd0f7bebf67078cbea7818d5199266a608111a2ff588c6e5105"} Apr 21 04:09:59.231573 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.231261 2575 scope.go:117] "RemoveContainer" containerID="7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f" Apr 21 04:09:59.231573 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.231455 2575 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb" Apr 21 04:09:59.234973 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.234953 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerID="2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313" exitCode=2 Apr 21 04:09:59.235069 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.235027 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerDied","Data":"2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313"} Apr 21 04:09:59.240704 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.240676 2575 scope.go:117] "RemoveContainer" containerID="8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868" Apr 21 04:09:59.248806 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.248787 2575 scope.go:117] "RemoveContainer" containerID="7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f" Apr 21 04:09:59.249056 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:09:59.249037 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f\": container with ID starting with 7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f not found: ID does not exist" containerID="7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f" Apr 21 04:09:59.249127 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.249068 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f"} err="failed to get container status \"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f\": rpc error: code = 
NotFound desc = could not find container \"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f\": container with ID starting with 7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f not found: ID does not exist" Apr 21 04:09:59.249127 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.249093 2575 scope.go:117] "RemoveContainer" containerID="8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868" Apr 21 04:09:59.249365 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:09:59.249335 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868\": container with ID starting with 8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868 not found: ID does not exist" containerID="8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868" Apr 21 04:09:59.249416 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.249371 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868"} err="failed to get container status \"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868\": rpc error: code = NotFound desc = could not find container \"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868\": container with ID starting with 8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868 not found: ID does not exist" Apr 21 04:09:59.249455 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.249417 2575 scope.go:117] "RemoveContainer" containerID="7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f" Apr 21 04:09:59.249664 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.249645 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f"} err="failed to get 
container status \"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f\": rpc error: code = NotFound desc = could not find container \"7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f\": container with ID starting with 7f7a5fe625d07e54669f3a75f5ee75b46829b191c74b613f2de37b274f45355f not found: ID does not exist" Apr 21 04:09:59.249708 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.249672 2575 scope.go:117] "RemoveContainer" containerID="8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868" Apr 21 04:09:59.249959 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.249937 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868"} err="failed to get container status \"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868\": rpc error: code = NotFound desc = could not find container \"8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868\": container with ID starting with 8044d7fbcbbddb2bd0c653cf802e28494264c92b73e6b05f738835db8bbe9868 not found: ID does not exist" Apr 21 04:09:59.256136 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.256112 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls\") pod \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " Apr 21 04:09:59.256244 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.256202 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-message-dumper-kube-rbac-proxy-sar-config\") pod \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " Apr 21 04:09:59.256306 ip-10-0-134-15 kubenswrapper[2575]: 
I0421 04:09:59.256240 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvb29\" (UniqueName: \"kubernetes.io/projected/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-kube-api-access-zvb29\") pod \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\" (UID: \"6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b\") " Apr 21 04:09:59.256585 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.256559 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" (UID: "6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:09:59.258727 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.258705 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-kube-api-access-zvb29" (OuterVolumeSpecName: "kube-api-access-zvb29") pod "6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" (UID: "6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b"). InnerVolumeSpecName "kube-api-access-zvb29". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:09:59.258858 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.258705 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" (UID: "6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:09:59.294335 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.294312 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"] Apr 21 04:09:59.296657 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:09:59.296631 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27277bf6_c33b_46e1_b722_533703e7c2ff.slice/crio-c94749961d9813256f4d1193a7f57a368ca463def9d11a881edc6d0aa42dac4b WatchSource:0}: Error finding container c94749961d9813256f4d1193a7f57a368ca463def9d11a881edc6d0aa42dac4b: Status 404 returned error can't find the container with id c94749961d9813256f4d1193a7f57a368ca463def9d11a881edc6d0aa42dac4b Apr 21 04:09:59.357050 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.357028 2575 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:09:59.357050 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.357051 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvb29\" (UniqueName: \"kubernetes.io/projected/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-kube-api-access-zvb29\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:09:59.357169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.357062 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:09:59.552844 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.552812 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"] 
Apr 21 04:09:59.554586 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:09:59.554561 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-l98rb"] Apr 21 04:10:00.238674 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:00.238644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" event={"ID":"27277bf6-c33b-46e1-b722-533703e7c2ff","Type":"ContainerStarted","Data":"9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9"} Apr 21 04:10:00.238674 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:00.238677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" event={"ID":"27277bf6-c33b-46e1-b722-533703e7c2ff","Type":"ContainerStarted","Data":"c94749961d9813256f4d1193a7f57a368ca463def9d11a881edc6d0aa42dac4b"} Apr 21 04:10:01.008070 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:01.008031 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 21 04:10:01.027731 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:01.027704 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" path="/var/lib/kubelet/pods/6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b/volumes" Apr 21 04:10:03.250368 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:03.250336 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerID="4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd" exitCode=0 Apr 21 04:10:03.250732 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:03.250423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerDied","Data":"4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd"} Apr 21 04:10:04.256655 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:04.256617 2575 generic.go:358] "Generic (PLEG): container finished" podID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerID="9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9" exitCode=0 Apr 21 04:10:04.257063 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:04.256663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" event={"ID":"27277bf6-c33b-46e1-b722-533703e7c2ff","Type":"ContainerDied","Data":"9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9"} Apr 21 04:10:06.008180 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:06.008117 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 21 04:10:06.012510 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:06.012467 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 21 04:10:06.012995 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:06.012930 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:10:11.007709 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:10:11.007632 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 21 04:10:11.008163 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:11.007748 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" Apr 21 04:10:11.282031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:11.281952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" event={"ID":"27277bf6-c33b-46e1-b722-533703e7c2ff","Type":"ContainerStarted","Data":"160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369"} Apr 21 04:10:11.282031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:11.281985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" event={"ID":"27277bf6-c33b-46e1-b722-533703e7c2ff","Type":"ContainerStarted","Data":"63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2"} Apr 21 04:10:11.282216 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:11.282199 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:10:11.301849 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:11.301793 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podStartSLOduration=6.927421047 podStartE2EDuration="13.301779542s" podCreationTimestamp="2026-04-21 04:09:58 +0000 UTC" firstStartedPulling="2026-04-21 04:10:04.258094606 +0000 UTC m=+789.922684670" lastFinishedPulling="2026-04-21 04:10:10.632453098 +0000 UTC 
m=+796.297043165" observedRunningTime="2026-04-21 04:10:11.300280306 +0000 UTC m=+796.964870391" watchObservedRunningTime="2026-04-21 04:10:11.301779542 +0000 UTC m=+796.966369628" Apr 21 04:10:12.284386 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:12.284350 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:10:12.285708 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:12.285680 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 21 04:10:13.287374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:13.287332 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 21 04:10:16.007664 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:16.007583 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 21 04:10:16.011952 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:16.011929 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 21 04:10:16.012261 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:16.012238 2575 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:10:18.291318 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:18.291292 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" Apr 21 04:10:18.291928 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:18.291897 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 21 04:10:21.008312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:21.008266 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 21 04:10:26.008275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:26.008234 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 21 04:10:26.012566 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:26.012542 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: 
connect: connection refused"
Apr 21 04:10:26.012671 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:26.012659 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:10:26.012903 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:26.012879 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:10:26.012998 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:26.012984 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:10:28.292604 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:28.292565 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 21 04:10:28.941601 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:28.941580 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:10:29.073221 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.073149 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c6245e6-0f47-41dd-8536-df98877dceb9-isvc-logger-kube-rbac-proxy-sar-config\") pod \"4c6245e6-0f47-41dd-8536-df98877dceb9\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") "
Apr 21 04:10:29.073353 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.073229 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c6245e6-0f47-41dd-8536-df98877dceb9-kserve-provision-location\") pod \"4c6245e6-0f47-41dd-8536-df98877dceb9\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") "
Apr 21 04:10:29.073353 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.073325 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl8k9\" (UniqueName: \"kubernetes.io/projected/4c6245e6-0f47-41dd-8536-df98877dceb9-kube-api-access-bl8k9\") pod \"4c6245e6-0f47-41dd-8536-df98877dceb9\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") "
Apr 21 04:10:29.073459 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.073379 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c6245e6-0f47-41dd-8536-df98877dceb9-proxy-tls\") pod \"4c6245e6-0f47-41dd-8536-df98877dceb9\" (UID: \"4c6245e6-0f47-41dd-8536-df98877dceb9\") "
Apr 21 04:10:29.073512 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.073452 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c6245e6-0f47-41dd-8536-df98877dceb9-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "4c6245e6-0f47-41dd-8536-df98877dceb9" (UID: "4c6245e6-0f47-41dd-8536-df98877dceb9"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:10:29.073512 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.073462 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c6245e6-0f47-41dd-8536-df98877dceb9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4c6245e6-0f47-41dd-8536-df98877dceb9" (UID: "4c6245e6-0f47-41dd-8536-df98877dceb9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:10:29.073617 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.073577 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c6245e6-0f47-41dd-8536-df98877dceb9-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:10:29.073617 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.073591 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c6245e6-0f47-41dd-8536-df98877dceb9-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:10:29.075317 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.075292 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6245e6-0f47-41dd-8536-df98877dceb9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4c6245e6-0f47-41dd-8536-df98877dceb9" (UID: "4c6245e6-0f47-41dd-8536-df98877dceb9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:10:29.075427 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.075401 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6245e6-0f47-41dd-8536-df98877dceb9-kube-api-access-bl8k9" (OuterVolumeSpecName: "kube-api-access-bl8k9") pod "4c6245e6-0f47-41dd-8536-df98877dceb9" (UID: "4c6245e6-0f47-41dd-8536-df98877dceb9"). InnerVolumeSpecName "kube-api-access-bl8k9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:10:29.174868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.174830 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bl8k9\" (UniqueName: \"kubernetes.io/projected/4c6245e6-0f47-41dd-8536-df98877dceb9-kube-api-access-bl8k9\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:10:29.174868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.174867 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c6245e6-0f47-41dd-8536-df98877dceb9-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:10:29.335131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.335060 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerID="fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa" exitCode=0
Apr 21 04:10:29.335469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.335141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerDied","Data":"fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa"}
Apr 21 04:10:29.335469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.335156 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"
Apr 21 04:10:29.335469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.335183 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t" event={"ID":"4c6245e6-0f47-41dd-8536-df98877dceb9","Type":"ContainerDied","Data":"944b62e81fec0891c2276318fa475b440b7bd512fdec053dd39c8139bccb7871"}
Apr 21 04:10:29.335469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.335204 2575 scope.go:117] "RemoveContainer" containerID="fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa"
Apr 21 04:10:29.342731 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.342709 2575 scope.go:117] "RemoveContainer" containerID="2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313"
Apr 21 04:10:29.349683 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.349668 2575 scope.go:117] "RemoveContainer" containerID="4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd"
Apr 21 04:10:29.356603 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.356584 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"]
Apr 21 04:10:29.357239 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.357227 2575 scope.go:117] "RemoveContainer" containerID="92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45"
Apr 21 04:10:29.361301 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.361281 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-sc54t"]
Apr 21 04:10:29.364609 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.364593 2575 scope.go:117] "RemoveContainer" containerID="fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa"
Apr 21 04:10:29.364893 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:10:29.364872 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa\": container with ID starting with fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa not found: ID does not exist" containerID="fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa"
Apr 21 04:10:29.364944 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.364901 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa"} err="failed to get container status \"fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa\": rpc error: code = NotFound desc = could not find container \"fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa\": container with ID starting with fa23d77694163e1a222767a2057401c66bf7e14f6bbf568b69f748699c44c0aa not found: ID does not exist"
Apr 21 04:10:29.364944 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.364918 2575 scope.go:117] "RemoveContainer" containerID="2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313"
Apr 21 04:10:29.365150 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:10:29.365132 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313\": container with ID starting with 2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313 not found: ID does not exist" containerID="2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313"
Apr 21 04:10:29.365194 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.365156 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313"} err="failed to get container status \"2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313\": rpc error: code = NotFound desc = could not find container \"2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313\": container with ID starting with 2ef1b35d6c7c922f42c20b1362614fafc11aa21b14b0ca1ac032383eda63f313 not found: ID does not exist"
Apr 21 04:10:29.365194 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.365174 2575 scope.go:117] "RemoveContainer" containerID="4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd"
Apr 21 04:10:29.365370 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:10:29.365352 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd\": container with ID starting with 4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd not found: ID does not exist" containerID="4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd"
Apr 21 04:10:29.365425 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.365378 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd"} err="failed to get container status \"4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd\": rpc error: code = NotFound desc = could not find container \"4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd\": container with ID starting with 4b9558b446fc10b86f2e6953dee7be78df75986c00149cfa2239ad5f872220cd not found: ID does not exist"
Apr 21 04:10:29.365425 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.365400 2575 scope.go:117] "RemoveContainer" containerID="92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45"
Apr 21 04:10:29.365625 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:10:29.365610 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45\": container with ID starting with 92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45 not found: ID does not exist" containerID="92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45"
Apr 21 04:10:29.365662 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:29.365629 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45"} err="failed to get container status \"92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45\": rpc error: code = NotFound desc = could not find container \"92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45\": container with ID starting with 92fbfb929e46bf556a0f24e55b72db6bca1a56ffa7625b814944af7dd46d2c45 not found: ID does not exist"
Apr 21 04:10:31.028178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:31.028145 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" path="/var/lib/kubelet/pods/4c6245e6-0f47-41dd-8536-df98877dceb9/volumes"
Apr 21 04:10:38.292038 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:38.291994 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 21 04:10:48.292262 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:48.292222 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 21 04:10:58.292135 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:10:58.292095 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 21 04:11:08.292080 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:08.292039 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 21 04:11:18.292648 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:18.292604 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 21 04:11:28.292952 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:28.292916 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"
Apr 21 04:11:28.945809 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:28.945770 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"]
Apr 21 04:11:28.946123 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:28.946097 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" containerID="cri-o://63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2" gracePeriod=30
Apr 21 04:11:28.946221 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:28.946144 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kube-rbac-proxy" containerID="cri-o://160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369" gracePeriod=30
Apr 21 04:11:29.074469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074442 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"]
Apr 21 04:11:29.074707 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074696 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container"
Apr 21 04:11:29.074752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074708 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container"
Apr 21 04:11:29.074752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074724 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kserve-container"
Apr 21 04:11:29.074752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074729 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kserve-container"
Apr 21 04:11:29.074752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074741 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kube-rbac-proxy"
Apr 21 04:11:29.074752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074747 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kube-rbac-proxy"
Apr 21 04:11:29.074752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074772 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074779 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074785 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="storage-initializer"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074792 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="storage-initializer"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074802 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074807 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074865 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kserve-container"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074873 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kube-rbac-proxy"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074881 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="kserve-container"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074888 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b08d9d7-a5e2-48d7-abbb-5eb4ae2f5a0b" containerName="kube-rbac-proxy"
Apr 21 04:11:29.074947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.074893 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c6245e6-0f47-41dd-8536-df98877dceb9" containerName="agent"
Apr 21 04:11:29.077693 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.077672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.079903 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.079883 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\""
Apr 21 04:11:29.080004 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.079985 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\""
Apr 21 04:11:29.085856 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.085833 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"]
Apr 21 04:11:29.179133 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.179110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.179248 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.179140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87nbv\" (UniqueName: \"kubernetes.io/projected/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kube-api-access-87nbv\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.179248 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.179167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.179338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.179264 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.280595 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.280537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.280595 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.280576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.280728 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.280600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87nbv\" (UniqueName: \"kubernetes.io/projected/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kube-api-access-87nbv\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.280728 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.280632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.280728 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:11:29.280697 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-serving-cert: secret "isvc-lightgbm-runtime-predictor-serving-cert" not found
Apr 21 04:11:29.280892 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:11:29.280783 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls podName:dd5cbef3-0d90-4b6f-a5a8-1308b345c43c nodeName:}" failed. No retries permitted until 2026-04-21 04:11:29.780741902 +0000 UTC m=+875.445331973 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls") pod "isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" (UID: "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c") : secret "isvc-lightgbm-runtime-predictor-serving-cert" not found
Apr 21 04:11:29.281060 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.281036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.281278 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.281259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.289208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.289186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87nbv\" (UniqueName: \"kubernetes.io/projected/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kube-api-access-87nbv\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.497626 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.497595 2575 generic.go:358] "Generic (PLEG): container finished" podID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerID="160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369" exitCode=2
Apr 21 04:11:29.498005 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.497669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" event={"ID":"27277bf6-c33b-46e1-b722-533703e7c2ff","Type":"ContainerDied","Data":"160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369"}
Apr 21 04:11:29.783695 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.783662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.785987 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.785969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:29.988161 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:29.988134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"
Apr 21 04:11:30.105341 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:30.105319 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"]
Apr 21 04:11:30.107458 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:11:30.107426 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd5cbef3_0d90_4b6f_a5a8_1308b345c43c.slice/crio-6660e9e0a656362a575f8ad222451d15c06dc95389fc94912c8eb65c756e03d5 WatchSource:0}: Error finding container 6660e9e0a656362a575f8ad222451d15c06dc95389fc94912c8eb65c756e03d5: Status 404 returned error can't find the container with id 6660e9e0a656362a575f8ad222451d15c06dc95389fc94912c8eb65c756e03d5
Apr 21 04:11:30.502543 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:30.502510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" event={"ID":"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c","Type":"ContainerStarted","Data":"7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0"}
Apr 21 04:11:30.502940 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:30.502548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" event={"ID":"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c","Type":"ContainerStarted","Data":"6660e9e0a656362a575f8ad222451d15c06dc95389fc94912c8eb65c756e03d5"}
Apr 21 04:11:32.890318 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:32.890296 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"
Apr 21 04:11:33.006067 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.006009 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh46r\" (UniqueName: \"kubernetes.io/projected/27277bf6-c33b-46e1-b722-533703e7c2ff-kube-api-access-sh46r\") pod \"27277bf6-c33b-46e1-b722-533703e7c2ff\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") "
Apr 21 04:11:33.006067 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.006038 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27277bf6-c33b-46e1-b722-533703e7c2ff-kserve-provision-location\") pod \"27277bf6-c33b-46e1-b722-533703e7c2ff\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") "
Apr 21 04:11:33.006067 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.006065 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27277bf6-c33b-46e1-b722-533703e7c2ff-proxy-tls\") pod \"27277bf6-c33b-46e1-b722-533703e7c2ff\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") "
Apr 21 04:11:33.006319 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.006101 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27277bf6-c33b-46e1-b722-533703e7c2ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"27277bf6-c33b-46e1-b722-533703e7c2ff\" (UID: \"27277bf6-c33b-46e1-b722-533703e7c2ff\") "
Apr 21 04:11:33.006400 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.006375 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27277bf6-c33b-46e1-b722-533703e7c2ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "27277bf6-c33b-46e1-b722-533703e7c2ff" (UID: "27277bf6-c33b-46e1-b722-533703e7c2ff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:11:33.006483 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.006460 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27277bf6-c33b-46e1-b722-533703e7c2ff-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "27277bf6-c33b-46e1-b722-533703e7c2ff" (UID: "27277bf6-c33b-46e1-b722-533703e7c2ff"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:11:33.008094 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.008064 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27277bf6-c33b-46e1-b722-533703e7c2ff-kube-api-access-sh46r" (OuterVolumeSpecName: "kube-api-access-sh46r") pod "27277bf6-c33b-46e1-b722-533703e7c2ff" (UID: "27277bf6-c33b-46e1-b722-533703e7c2ff"). InnerVolumeSpecName "kube-api-access-sh46r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:11:33.008195 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.008120 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27277bf6-c33b-46e1-b722-533703e7c2ff-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "27277bf6-c33b-46e1-b722-533703e7c2ff" (UID: "27277bf6-c33b-46e1-b722-533703e7c2ff"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:11:33.106549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.106526 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27277bf6-c33b-46e1-b722-533703e7c2ff-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:11:33.106549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.106548 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27277bf6-c33b-46e1-b722-533703e7c2ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:11:33.106671 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.106558 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sh46r\" (UniqueName: \"kubernetes.io/projected/27277bf6-c33b-46e1-b722-533703e7c2ff-kube-api-access-sh46r\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:11:33.106671 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.106567 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27277bf6-c33b-46e1-b722-533703e7c2ff-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:11:33.511888 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.511853 2575 generic.go:358] "Generic (PLEG): container finished" podID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerID="63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2" exitCode=0
Apr 21 04:11:33.512031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.511903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" event={"ID":"27277bf6-c33b-46e1-b722-533703e7c2ff","Type":"ContainerDied","Data":"63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2"}
Apr 21 04:11:33.512031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.511926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9" event={"ID":"27277bf6-c33b-46e1-b722-533703e7c2ff","Type":"ContainerDied","Data":"c94749961d9813256f4d1193a7f57a368ca463def9d11a881edc6d0aa42dac4b"}
Apr 21 04:11:33.512031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.511932 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"
Apr 21 04:11:33.512031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.511941 2575 scope.go:117] "RemoveContainer" containerID="160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369"
Apr 21 04:11:33.519384 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.519368 2575 scope.go:117] "RemoveContainer" containerID="63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2"
Apr 21 04:11:33.526194 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.526171 2575 scope.go:117] "RemoveContainer" containerID="9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9"
Apr 21 04:11:33.528464 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.528150 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"]
Apr 21 04:11:33.529633 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.529607 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-75qr9"]
Apr 21 04:11:33.533270 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.533256 2575 scope.go:117] "RemoveContainer" containerID="160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369"
Apr 21 04:11:33.533497 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:11:33.533480 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369\": container with ID starting with 160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369 not found: ID does not exist" containerID="160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369"
Apr 21 04:11:33.533543 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.533505 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369"} err="failed to get container status \"160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369\": rpc error: code = NotFound desc = could not find container \"160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369\": container with ID starting with 160c7533f109af89e8d2a45bee431778eafe8360d701e78be5d1753ca21a7369 not found: ID does not exist"
Apr 21 04:11:33.533543 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.533521 2575 scope.go:117] "RemoveContainer" containerID="63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2"
Apr 21 04:11:33.533710 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:11:33.533695 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2\": container with ID starting with 63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2 not found: ID does not exist" containerID="63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2"
Apr 21 04:11:33.533769 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.533712 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2"} err="failed to get container status \"63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2\": rpc error: code = NotFound desc = could not find container
\"63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2\": container with ID starting with 63c38edec43c6b5ce2f4a8a81097a4f416c43333b14560d9dd96507cc7876da2 not found: ID does not exist" Apr 21 04:11:33.533769 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.533724 2575 scope.go:117] "RemoveContainer" containerID="9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9" Apr 21 04:11:33.534003 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:11:33.533986 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9\": container with ID starting with 9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9 not found: ID does not exist" containerID="9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9" Apr 21 04:11:33.534041 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:33.534006 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9"} err="failed to get container status \"9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9\": rpc error: code = NotFound desc = could not find container \"9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9\": container with ID starting with 9fb2e33f5c9926f1b97e4f5ef47f2c2e9aa69b792a6f17d4633e70ed4ef917b9 not found: ID does not exist" Apr 21 04:11:34.516328 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:34.516300 2575 generic.go:358] "Generic (PLEG): container finished" podID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerID="7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0" exitCode=0 Apr 21 04:11:34.516701 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:34.516355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" 
event={"ID":"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c","Type":"ContainerDied","Data":"7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0"} Apr 21 04:11:35.028091 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:35.028062 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" path="/var/lib/kubelet/pods/27277bf6-c33b-46e1-b722-533703e7c2ff/volumes" Apr 21 04:11:35.520061 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:35.520027 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" event={"ID":"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c","Type":"ContainerStarted","Data":"dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06"} Apr 21 04:11:35.520421 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:35.520070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" event={"ID":"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c","Type":"ContainerStarted","Data":"6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5"} Apr 21 04:11:35.520421 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:35.520345 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" Apr 21 04:11:35.538532 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:35.538492 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podStartSLOduration=6.53847953 podStartE2EDuration="6.53847953s" podCreationTimestamp="2026-04-21 04:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:11:35.536870936 +0000 UTC m=+881.201461043" watchObservedRunningTime="2026-04-21 04:11:35.53847953 +0000 UTC m=+881.203069616" Apr 21 
04:11:36.522543 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:36.522513 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" Apr 21 04:11:36.523786 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:36.523744 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:11:37.525470 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:37.525431 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:11:42.530444 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:42.530415 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" Apr 21 04:11:42.530923 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:42.530897 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:11:52.531527 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:11:52.531442 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:12:02.531379 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:02.531342 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:12:12.531472 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:12.531432 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:12:22.531151 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:22.531111 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:12:32.530857 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:32.530813 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:12:42.531455 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:42.531416 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:12:52.531484 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:52.531445 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" Apr 21 04:12:59.396202 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.396166 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"] Apr 21 04:12:59.396695 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.396477 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" containerID="cri-o://6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5" gracePeriod=30 Apr 21 04:12:59.396695 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.396513 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kube-rbac-proxy" containerID="cri-o://dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06" gracePeriod=30 Apr 21 04:12:59.499999 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.499963 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf"] Apr 21 04:12:59.500294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.500277 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="storage-initializer" Apr 21 04:12:59.500379 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.500296 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="storage-initializer" Apr 21 04:12:59.500379 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.500321 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kube-rbac-proxy" Apr 21 04:12:59.500379 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.500331 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kube-rbac-proxy" Apr 21 04:12:59.500379 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.500353 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" Apr 21 04:12:59.500379 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.500364 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" Apr 21 04:12:59.500686 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.500433 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kserve-container" Apr 21 04:12:59.500686 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.500445 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="27277bf6-c33b-46e1-b722-533703e7c2ff" containerName="kube-rbac-proxy" Apr 21 04:12:59.503678 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.503659 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.506000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.505980 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 21 04:12:59.506000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.505990 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 21 04:12:59.513425 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.513402 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf"] Apr 21 04:12:59.609835 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.609806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/008ee74a-8022-47e7-b37b-18d45c4f78b3-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.609973 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.609852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/008ee74a-8022-47e7-b37b-18d45c4f78b3-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.609973 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.609876 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.609973 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.609920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwhz\" (UniqueName: \"kubernetes.io/projected/008ee74a-8022-47e7-b37b-18d45c4f78b3-kube-api-access-4zwhz\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.710807 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.710724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.710807 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.710770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zwhz\" (UniqueName: \"kubernetes.io/projected/008ee74a-8022-47e7-b37b-18d45c4f78b3-kube-api-access-4zwhz\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.710954 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.710828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/008ee74a-8022-47e7-b37b-18d45c4f78b3-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.710954 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.710857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/008ee74a-8022-47e7-b37b-18d45c4f78b3-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.710954 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:12:59.710885 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-serving-cert: secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 21 04:12:59.711092 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:12:59.710956 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls podName:008ee74a-8022-47e7-b37b-18d45c4f78b3 nodeName:}" failed. No retries permitted until 2026-04-21 04:13:00.2109349 +0000 UTC m=+965.875524978 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls") pod "isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" (UID: "008ee74a-8022-47e7-b37b-18d45c4f78b3") : secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 21 04:12:59.711208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.711191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/008ee74a-8022-47e7-b37b-18d45c4f78b3-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.711547 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.711530 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/008ee74a-8022-47e7-b37b-18d45c4f78b3-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.719087 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.719060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwhz\" (UniqueName: \"kubernetes.io/projected/008ee74a-8022-47e7-b37b-18d45c4f78b3-kube-api-access-4zwhz\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:12:59.743899 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.743874 2575 generic.go:358] "Generic (PLEG): container finished" podID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" 
containerID="dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06" exitCode=2 Apr 21 04:12:59.743987 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:12:59.743939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" event={"ID":"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c","Type":"ContainerDied","Data":"dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06"} Apr 21 04:13:00.215132 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:00.215101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:13:00.217326 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:00.217306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:13:00.414226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:00.414195 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:13:00.535498 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:00.535472 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf"] Apr 21 04:13:00.538017 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:13:00.537986 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008ee74a_8022_47e7_b37b_18d45c4f78b3.slice/crio-5605c4d316e455585ea47b893f273e6205ad7fe1b169583057f2918946cf1d2f WatchSource:0}: Error finding container 5605c4d316e455585ea47b893f273e6205ad7fe1b169583057f2918946cf1d2f: Status 404 returned error can't find the container with id 5605c4d316e455585ea47b893f273e6205ad7fe1b169583057f2918946cf1d2f Apr 21 04:13:00.748244 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:00.748168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" event={"ID":"008ee74a-8022-47e7-b37b-18d45c4f78b3","Type":"ContainerStarted","Data":"777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a"} Apr 21 04:13:00.748244 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:00.748204 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" event={"ID":"008ee74a-8022-47e7-b37b-18d45c4f78b3","Type":"ContainerStarted","Data":"5605c4d316e455585ea47b893f273e6205ad7fe1b169583057f2918946cf1d2f"} Apr 21 04:13:02.525939 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:02.525890 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.26:8643/healthz\": dial tcp 10.132.0.26:8643: connect: 
connection refused" Apr 21 04:13:02.531210 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:02.531184 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 21 04:13:03.527204 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.527182 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" Apr 21 04:13:03.540436 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.540412 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " Apr 21 04:13:03.540528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.540484 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87nbv\" (UniqueName: \"kubernetes.io/projected/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kube-api-access-87nbv\") pod \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " Apr 21 04:13:03.540528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.540512 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls\") pod \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " Apr 21 04:13:03.540631 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.540536 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kserve-provision-location\") pod \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\" (UID: \"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c\") " Apr 21 04:13:03.540706 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.540679 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" (UID: "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:13:03.540945 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.540924 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" (UID: "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:13:03.542486 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.542451 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kube-api-access-87nbv" (OuterVolumeSpecName: "kube-api-access-87nbv") pod "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" (UID: "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c"). InnerVolumeSpecName "kube-api-access-87nbv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:13:03.542486 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.542470 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" (UID: "dd5cbef3-0d90-4b6f-a5a8-1308b345c43c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:13:03.641849 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.641827 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87nbv\" (UniqueName: \"kubernetes.io/projected/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kube-api-access-87nbv\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:13:03.641965 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.641851 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:13:03.641965 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.641863 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:13:03.641965 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.641873 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:13:03.757709 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.757679 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerID="6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5" exitCode=0 Apr 21 04:13:03.757843 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.757731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" event={"ID":"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c","Type":"ContainerDied","Data":"6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5"} Apr 21 04:13:03.757843 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.757784 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" Apr 21 04:13:03.757843 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.757797 2575 scope.go:117] "RemoveContainer" containerID="dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06" Apr 21 04:13:03.757997 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.757785 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt" event={"ID":"dd5cbef3-0d90-4b6f-a5a8-1308b345c43c","Type":"ContainerDied","Data":"6660e9e0a656362a575f8ad222451d15c06dc95389fc94912c8eb65c756e03d5"} Apr 21 04:13:03.765822 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.765661 2575 scope.go:117] "RemoveContainer" containerID="6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5" Apr 21 04:13:03.773674 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.773655 2575 scope.go:117] "RemoveContainer" containerID="7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0" Apr 21 04:13:03.778690 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.778667 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"] Apr 21 04:13:03.780377 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.780361 2575 scope.go:117] 
"RemoveContainer" containerID="dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06" Apr 21 04:13:03.781024 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:13:03.780940 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06\": container with ID starting with dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06 not found: ID does not exist" containerID="dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06" Apr 21 04:13:03.781024 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.780984 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06"} err="failed to get container status \"dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06\": rpc error: code = NotFound desc = could not find container \"dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06\": container with ID starting with dbece8ce6c1e983be3c4cc7d75c4047a41e99a351e72b75d17e9aaa7c5e58c06 not found: ID does not exist" Apr 21 04:13:03.781024 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.781013 2575 scope.go:117] "RemoveContainer" containerID="6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5" Apr 21 04:13:03.781485 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:13:03.781384 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5\": container with ID starting with 6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5 not found: ID does not exist" containerID="6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5" Apr 21 04:13:03.781485 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.781418 2575 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5"} err="failed to get container status \"6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5\": rpc error: code = NotFound desc = could not find container \"6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5\": container with ID starting with 6b7419c6f1525c9a08ce0ce9d870d7e26a6206767e853d9b7a43d97a0a49d7d5 not found: ID does not exist" Apr 21 04:13:03.781485 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.781440 2575 scope.go:117] "RemoveContainer" containerID="7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0" Apr 21 04:13:03.781884 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:13:03.781858 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0\": container with ID starting with 7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0 not found: ID does not exist" containerID="7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0" Apr 21 04:13:03.781884 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.781893 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0"} err="failed to get container status \"7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0\": rpc error: code = NotFound desc = could not find container \"7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0\": container with ID starting with 7474bf43a5f904c9a19fd987b8d6de45b541064b0fe573a5ce29c48c40c8e7e0 not found: ID does not exist" Apr 21 04:13:03.782872 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:03.782854 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-ddrlt"] 
Apr 21 04:13:04.762267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:04.762191 2575 generic.go:358] "Generic (PLEG): container finished" podID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerID="777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a" exitCode=0 Apr 21 04:13:04.762267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:04.762232 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" event={"ID":"008ee74a-8022-47e7-b37b-18d45c4f78b3","Type":"ContainerDied","Data":"777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a"} Apr 21 04:13:05.027253 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:13:05.027187 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" path="/var/lib/kubelet/pods/dd5cbef3-0d90-4b6f-a5a8-1308b345c43c/volumes" Apr 21 04:15:20.440890 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:20.440865 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:15:21.174535 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:21.174505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" event={"ID":"008ee74a-8022-47e7-b37b-18d45c4f78b3","Type":"ContainerStarted","Data":"5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf"} Apr 21 04:15:21.174535 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:21.174540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" event={"ID":"008ee74a-8022-47e7-b37b-18d45c4f78b3","Type":"ContainerStarted","Data":"f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f"} Apr 21 04:15:21.174880 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:21.174667 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:15:21.201254 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:21.201197 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" podStartSLOduration=6.668279864 podStartE2EDuration="2m22.201178416s" podCreationTimestamp="2026-04-21 04:12:59 +0000 UTC" firstStartedPulling="2026-04-21 04:13:04.763309426 +0000 UTC m=+970.427899490" lastFinishedPulling="2026-04-21 04:15:20.296207957 +0000 UTC m=+1105.960798042" observedRunningTime="2026-04-21 04:15:21.198567467 +0000 UTC m=+1106.863157563" watchObservedRunningTime="2026-04-21 04:15:21.201178416 +0000 UTC m=+1106.865768503" Apr 21 04:15:22.177498 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:22.177468 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:15:28.185425 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:28.185395 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:15:58.188881 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:58.188848 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:15:59.658452 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.658416 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf"] Apr 21 04:15:59.658901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.658683 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="kserve-container" 
containerID="cri-o://f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f" gracePeriod=30 Apr 21 04:15:59.659284 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.659244 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="kube-rbac-proxy" containerID="cri-o://5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf" gracePeriod=30 Apr 21 04:15:59.759367 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759334 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5"] Apr 21 04:15:59.759671 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759657 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="storage-initializer" Apr 21 04:15:59.759718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759672 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="storage-initializer" Apr 21 04:15:59.759718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759681 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kube-rbac-proxy" Apr 21 04:15:59.759718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759687 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kube-rbac-proxy" Apr 21 04:15:59.759718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759697 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" Apr 21 04:15:59.759718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759703 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" Apr 21 04:15:59.759888 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759769 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kube-rbac-proxy" Apr 21 04:15:59.759888 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.759778 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5cbef3-0d90-4b6f-a5a8-1308b345c43c" containerName="kserve-container" Apr 21 04:15:59.770533 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.770504 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:15:59.773084 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.773048 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5"] Apr 21 04:15:59.773211 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.773094 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 21 04:15:59.773211 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.773166 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 21 04:15:59.900875 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.900841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5chsc\" (UniqueName: \"kubernetes.io/projected/cba5af0e-c7e3-435d-bd11-635a21ac9446-kube-api-access-5chsc\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:15:59.900875 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:15:59.900875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba5af0e-c7e3-435d-bd11-635a21ac9446-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:15:59.901106 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.900897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba5af0e-c7e3-435d-bd11-635a21ac9446-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:15:59.901106 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:15:59.900992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba5af0e-c7e3-435d-bd11-635a21ac9446-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.001710 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.001611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba5af0e-c7e3-435d-bd11-635a21ac9446-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.001710 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.001699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5chsc\" (UniqueName: \"kubernetes.io/projected/cba5af0e-c7e3-435d-bd11-635a21ac9446-kube-api-access-5chsc\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.001985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.001730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba5af0e-c7e3-435d-bd11-635a21ac9446-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.001985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.001781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba5af0e-c7e3-435d-bd11-635a21ac9446-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.002149 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.002124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba5af0e-c7e3-435d-bd11-635a21ac9446-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.002342 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.002324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba5af0e-c7e3-435d-bd11-635a21ac9446-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.004149 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.004126 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba5af0e-c7e3-435d-bd11-635a21ac9446-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.009363 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.009336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5chsc\" (UniqueName: \"kubernetes.io/projected/cba5af0e-c7e3-435d-bd11-635a21ac9446-kube-api-access-5chsc\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.082835 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.082807 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:00.201858 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.201725 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5"] Apr 21 04:16:00.204810 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:16:00.204781 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba5af0e_c7e3_435d_bd11_635a21ac9446.slice/crio-0678fde42abae61e2fd45e3a9a976885675c4d804d78bebeb009aefa66b060fd WatchSource:0}: Error finding container 0678fde42abae61e2fd45e3a9a976885675c4d804d78bebeb009aefa66b060fd: Status 404 returned error can't find the container with id 0678fde42abae61e2fd45e3a9a976885675c4d804d78bebeb009aefa66b060fd Apr 21 04:16:00.278307 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.278270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" event={"ID":"cba5af0e-c7e3-435d-bd11-635a21ac9446","Type":"ContainerStarted","Data":"b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9"} Apr 21 04:16:00.278579 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.278315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" event={"ID":"cba5af0e-c7e3-435d-bd11-635a21ac9446","Type":"ContainerStarted","Data":"0678fde42abae61e2fd45e3a9a976885675c4d804d78bebeb009aefa66b060fd"} Apr 21 04:16:00.280403 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.280379 2575 generic.go:358] "Generic (PLEG): container finished" podID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerID="5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf" exitCode=2 Apr 21 04:16:00.280499 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.280455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" event={"ID":"008ee74a-8022-47e7-b37b-18d45c4f78b3","Type":"ContainerDied","Data":"5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf"} Apr 21 04:16:00.595486 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.595453 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:16:00.709364 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.709330 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zwhz\" (UniqueName: \"kubernetes.io/projected/008ee74a-8022-47e7-b37b-18d45c4f78b3-kube-api-access-4zwhz\") pod \"008ee74a-8022-47e7-b37b-18d45c4f78b3\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " Apr 21 04:16:00.709729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.709385 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls\") pod \"008ee74a-8022-47e7-b37b-18d45c4f78b3\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " Apr 21 04:16:00.709729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.709458 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/008ee74a-8022-47e7-b37b-18d45c4f78b3-kserve-provision-location\") pod \"008ee74a-8022-47e7-b37b-18d45c4f78b3\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " Apr 21 04:16:00.709729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.709517 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/008ee74a-8022-47e7-b37b-18d45c4f78b3-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod 
\"008ee74a-8022-47e7-b37b-18d45c4f78b3\" (UID: \"008ee74a-8022-47e7-b37b-18d45c4f78b3\") " Apr 21 04:16:00.709934 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.709837 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/008ee74a-8022-47e7-b37b-18d45c4f78b3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "008ee74a-8022-47e7-b37b-18d45c4f78b3" (UID: "008ee74a-8022-47e7-b37b-18d45c4f78b3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:16:00.709934 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.709878 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/008ee74a-8022-47e7-b37b-18d45c4f78b3-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "008ee74a-8022-47e7-b37b-18d45c4f78b3" (UID: "008ee74a-8022-47e7-b37b-18d45c4f78b3"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:16:00.711400 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.711376 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "008ee74a-8022-47e7-b37b-18d45c4f78b3" (UID: "008ee74a-8022-47e7-b37b-18d45c4f78b3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:16:00.711815 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.711796 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008ee74a-8022-47e7-b37b-18d45c4f78b3-kube-api-access-4zwhz" (OuterVolumeSpecName: "kube-api-access-4zwhz") pod "008ee74a-8022-47e7-b37b-18d45c4f78b3" (UID: "008ee74a-8022-47e7-b37b-18d45c4f78b3"). 
InnerVolumeSpecName "kube-api-access-4zwhz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:16:00.810439 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.810377 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/008ee74a-8022-47e7-b37b-18d45c4f78b3-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:16:00.810439 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.810402 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/008ee74a-8022-47e7-b37b-18d45c4f78b3-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:16:00.810439 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.810414 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zwhz\" (UniqueName: \"kubernetes.io/projected/008ee74a-8022-47e7-b37b-18d45c4f78b3-kube-api-access-4zwhz\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:16:00.810439 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:00.810423 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/008ee74a-8022-47e7-b37b-18d45c4f78b3-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:16:01.284737 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.284706 2575 generic.go:358] "Generic (PLEG): container finished" podID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerID="f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f" exitCode=0 Apr 21 04:16:01.284928 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.284792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" 
event={"ID":"008ee74a-8022-47e7-b37b-18d45c4f78b3","Type":"ContainerDied","Data":"f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f"} Apr 21 04:16:01.284928 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.284816 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" Apr 21 04:16:01.284928 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.284832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf" event={"ID":"008ee74a-8022-47e7-b37b-18d45c4f78b3","Type":"ContainerDied","Data":"5605c4d316e455585ea47b893f273e6205ad7fe1b169583057f2918946cf1d2f"} Apr 21 04:16:01.284928 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.284850 2575 scope.go:117] "RemoveContainer" containerID="5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf" Apr 21 04:16:01.292436 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.292416 2575 scope.go:117] "RemoveContainer" containerID="f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f" Apr 21 04:16:01.299192 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.299177 2575 scope.go:117] "RemoveContainer" containerID="777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a" Apr 21 04:16:01.301435 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.301415 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf"] Apr 21 04:16:01.306236 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.306215 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-4gskf"] Apr 21 04:16:01.306702 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.306688 2575 scope.go:117] "RemoveContainer" containerID="5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf" Apr 21 04:16:01.306982 
ip-10-0-134-15 kubenswrapper[2575]: E0421 04:16:01.306965 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf\": container with ID starting with 5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf not found: ID does not exist" containerID="5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf" Apr 21 04:16:01.307046 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.306990 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf"} err="failed to get container status \"5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf\": rpc error: code = NotFound desc = could not find container \"5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf\": container with ID starting with 5f8d1a985da7b5d153163d4a1f964c9b0a5ba18fd201c1e090196d3c5bf62bdf not found: ID does not exist" Apr 21 04:16:01.307046 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.307009 2575 scope.go:117] "RemoveContainer" containerID="f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f" Apr 21 04:16:01.307264 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:16:01.307248 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f\": container with ID starting with f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f not found: ID does not exist" containerID="f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f" Apr 21 04:16:01.307311 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.307271 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f"} 
err="failed to get container status \"f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f\": rpc error: code = NotFound desc = could not find container \"f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f\": container with ID starting with f7708228476f61facb08d5242ca629cd0522706aad39183aa9c2f0a33bfb4e3f not found: ID does not exist" Apr 21 04:16:01.307311 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.307287 2575 scope.go:117] "RemoveContainer" containerID="777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a" Apr 21 04:16:01.307523 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:16:01.307501 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a\": container with ID starting with 777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a not found: ID does not exist" containerID="777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a" Apr 21 04:16:01.307562 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:01.307527 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a"} err="failed to get container status \"777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a\": rpc error: code = NotFound desc = could not find container \"777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a\": container with ID starting with 777d607572338532df97046a79cbb1836e14029f1240ae4973fd840c8d0f009a not found: ID does not exist" Apr 21 04:16:03.027332 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:03.027301 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" path="/var/lib/kubelet/pods/008ee74a-8022-47e7-b37b-18d45c4f78b3/volumes" Apr 21 04:16:04.294773 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:16:04.294683 2575 generic.go:358] "Generic (PLEG): container finished" podID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerID="b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9" exitCode=0 Apr 21 04:16:04.295139 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:04.294774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" event={"ID":"cba5af0e-c7e3-435d-bd11-635a21ac9446","Type":"ContainerDied","Data":"b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9"} Apr 21 04:16:05.299178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:05.299142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" event={"ID":"cba5af0e-c7e3-435d-bd11-635a21ac9446","Type":"ContainerStarted","Data":"d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096"} Apr 21 04:16:05.299178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:05.299179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" event={"ID":"cba5af0e-c7e3-435d-bd11-635a21ac9446","Type":"ContainerStarted","Data":"a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33"} Apr 21 04:16:05.299625 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:05.299456 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:05.299625 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:05.299577 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:05.300962 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:05.300934 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" 
podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 21 04:16:05.316429 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:05.316390 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" podStartSLOduration=6.316379851 podStartE2EDuration="6.316379851s" podCreationTimestamp="2026-04-21 04:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:16:05.315439107 +0000 UTC m=+1150.980029193" watchObservedRunningTime="2026-04-21 04:16:05.316379851 +0000 UTC m=+1150.980969935" Apr 21 04:16:06.302519 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:06.302481 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 21 04:16:11.307057 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:11.307031 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:11.307885 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:11.307868 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:19.830154 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.830073 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5"] Apr 21 04:16:19.830637 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.830374 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kserve-container" containerID="cri-o://a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33" gracePeriod=30 Apr 21 04:16:19.830637 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.830426 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kube-rbac-proxy" containerID="cri-o://d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096" gracePeriod=30 Apr 21 04:16:19.892331 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892305 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x"] Apr 21 04:16:19.892608 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892597 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="kserve-container" Apr 21 04:16:19.892648 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892610 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="kserve-container" Apr 21 04:16:19.892648 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892624 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="storage-initializer" Apr 21 04:16:19.892648 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892630 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="storage-initializer" Apr 21 04:16:19.892648 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892642 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" 
containerName="kube-rbac-proxy" Apr 21 04:16:19.892648 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892647 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="kube-rbac-proxy" Apr 21 04:16:19.892844 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892688 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="kube-rbac-proxy" Apr 21 04:16:19.892844 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.892698 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="008ee74a-8022-47e7-b37b-18d45c4f78b3" containerName="kserve-container" Apr 21 04:16:19.895720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.895703 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:19.897982 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.897964 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 21 04:16:19.898059 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.898013 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 21 04:16:19.906133 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:19.906103 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x"] Apr 21 04:16:20.049572 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.049539 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3d69fa-743e-4812-a276-9db0ef3cb813-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: 
\"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.049746 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.049579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-759h7\" (UniqueName: \"kubernetes.io/projected/8d3d69fa-743e-4812-a276-9db0ef3cb813-kube-api-access-759h7\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.049746 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.049605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3d69fa-743e-4812-a276-9db0ef3cb813-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.049746 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.049639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d3d69fa-743e-4812-a276-9db0ef3cb813-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.150936 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.150887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3d69fa-743e-4812-a276-9db0ef3cb813-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: 
\"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.151131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.150951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-759h7\" (UniqueName: \"kubernetes.io/projected/8d3d69fa-743e-4812-a276-9db0ef3cb813-kube-api-access-759h7\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.151131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.150984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3d69fa-743e-4812-a276-9db0ef3cb813-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.151131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.151024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d3d69fa-743e-4812-a276-9db0ef3cb813-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.151337 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.151312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3d69fa-743e-4812-a276-9db0ef3cb813-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.151829 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.151807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d3d69fa-743e-4812-a276-9db0ef3cb813-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.153569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.153549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3d69fa-743e-4812-a276-9db0ef3cb813-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.159020 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.158994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-759h7\" (UniqueName: \"kubernetes.io/projected/8d3d69fa-743e-4812-a276-9db0ef3cb813-kube-api-access-759h7\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.206314 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.206260 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:20.324452 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.324355 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x"] Apr 21 04:16:20.339874 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.339848 2575 generic.go:358] "Generic (PLEG): container finished" podID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerID="d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096" exitCode=2 Apr 21 04:16:20.339984 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.339883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" event={"ID":"cba5af0e-c7e3-435d-bd11-635a21ac9446","Type":"ContainerDied","Data":"d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096"} Apr 21 04:16:20.352539 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:16:20.352515 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d3d69fa_743e_4812_a276_9db0ef3cb813.slice/crio-1ba5418f54c02eea04114930df80de9adcf0afb57e4df1cba9377450903e79c8 WatchSource:0}: Error finding container 1ba5418f54c02eea04114930df80de9adcf0afb57e4df1cba9377450903e79c8: Status 404 returned error can't find the container with id 1ba5418f54c02eea04114930df80de9adcf0afb57e4df1cba9377450903e79c8 Apr 21 04:16:20.454951 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.454930 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:20.553460 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.553427 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba5af0e-c7e3-435d-bd11-635a21ac9446-kserve-provision-location\") pod \"cba5af0e-c7e3-435d-bd11-635a21ac9446\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " Apr 21 04:16:20.553618 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.553479 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba5af0e-c7e3-435d-bd11-635a21ac9446-proxy-tls\") pod \"cba5af0e-c7e3-435d-bd11-635a21ac9446\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " Apr 21 04:16:20.553618 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.553516 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5chsc\" (UniqueName: \"kubernetes.io/projected/cba5af0e-c7e3-435d-bd11-635a21ac9446-kube-api-access-5chsc\") pod \"cba5af0e-c7e3-435d-bd11-635a21ac9446\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " Apr 21 04:16:20.553618 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.553547 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba5af0e-c7e3-435d-bd11-635a21ac9446-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"cba5af0e-c7e3-435d-bd11-635a21ac9446\" (UID: \"cba5af0e-c7e3-435d-bd11-635a21ac9446\") " Apr 21 04:16:20.553932 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.553888 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba5af0e-c7e3-435d-bd11-635a21ac9446-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"cba5af0e-c7e3-435d-bd11-635a21ac9446" (UID: "cba5af0e-c7e3-435d-bd11-635a21ac9446"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:16:20.554011 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.553988 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba5af0e-c7e3-435d-bd11-635a21ac9446-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "cba5af0e-c7e3-435d-bd11-635a21ac9446" (UID: "cba5af0e-c7e3-435d-bd11-635a21ac9446"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:16:20.555625 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.555600 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba5af0e-c7e3-435d-bd11-635a21ac9446-kube-api-access-5chsc" (OuterVolumeSpecName: "kube-api-access-5chsc") pod "cba5af0e-c7e3-435d-bd11-635a21ac9446" (UID: "cba5af0e-c7e3-435d-bd11-635a21ac9446"). InnerVolumeSpecName "kube-api-access-5chsc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:16:20.555714 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.555623 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba5af0e-c7e3-435d-bd11-635a21ac9446-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cba5af0e-c7e3-435d-bd11-635a21ac9446" (UID: "cba5af0e-c7e3-435d-bd11-635a21ac9446"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:16:20.654815 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.654747 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba5af0e-c7e3-435d-bd11-635a21ac9446-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:16:20.654815 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.654810 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba5af0e-c7e3-435d-bd11-635a21ac9446-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:16:20.654815 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.654824 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5chsc\" (UniqueName: \"kubernetes.io/projected/cba5af0e-c7e3-435d-bd11-635a21ac9446-kube-api-access-5chsc\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:16:20.655048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:20.654834 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba5af0e-c7e3-435d-bd11-635a21ac9446-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:16:21.344621 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.344546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" event={"ID":"8d3d69fa-743e-4812-a276-9db0ef3cb813","Type":"ContainerStarted","Data":"2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea"} Apr 21 04:16:21.344621 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.344580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" 
event={"ID":"8d3d69fa-743e-4812-a276-9db0ef3cb813","Type":"ContainerStarted","Data":"1ba5418f54c02eea04114930df80de9adcf0afb57e4df1cba9377450903e79c8"} Apr 21 04:16:21.346216 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.346188 2575 generic.go:358] "Generic (PLEG): container finished" podID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerID="a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33" exitCode=0 Apr 21 04:16:21.346327 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.346265 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" Apr 21 04:16:21.346369 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.346265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" event={"ID":"cba5af0e-c7e3-435d-bd11-635a21ac9446","Type":"ContainerDied","Data":"a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33"} Apr 21 04:16:21.346400 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.346370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5" event={"ID":"cba5af0e-c7e3-435d-bd11-635a21ac9446","Type":"ContainerDied","Data":"0678fde42abae61e2fd45e3a9a976885675c4d804d78bebeb009aefa66b060fd"} Apr 21 04:16:21.346400 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.346391 2575 scope.go:117] "RemoveContainer" containerID="d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096" Apr 21 04:16:21.353553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.353535 2575 scope.go:117] "RemoveContainer" containerID="a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33" Apr 21 04:16:21.360954 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.360936 2575 scope.go:117] "RemoveContainer" containerID="b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9" Apr 21 04:16:21.368135 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.368118 2575 scope.go:117] "RemoveContainer" containerID="d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096" Apr 21 04:16:21.368354 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:16:21.368336 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096\": container with ID starting with d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096 not found: ID does not exist" containerID="d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096" Apr 21 04:16:21.368419 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.368358 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096"} err="failed to get container status \"d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096\": rpc error: code = NotFound desc = could not find container \"d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096\": container with ID starting with d59cb421342597714f4aabfbdf4305f2f34bba4d797c5668008f6af4db1bb096 not found: ID does not exist" Apr 21 04:16:21.368419 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.368373 2575 scope.go:117] "RemoveContainer" containerID="a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33" Apr 21 04:16:21.368592 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:16:21.368577 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33\": container with ID starting with a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33 not found: ID does not exist" containerID="a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33" Apr 21 04:16:21.368646 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:16:21.368593 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33"} err="failed to get container status \"a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33\": rpc error: code = NotFound desc = could not find container \"a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33\": container with ID starting with a1234d55bd0c3f7c4e7260b9b5ab7ad78aa8a59afdd3179ccdb954fb46f0dc33 not found: ID does not exist" Apr 21 04:16:21.368646 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.368605 2575 scope.go:117] "RemoveContainer" containerID="b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9" Apr 21 04:16:21.368875 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:16:21.368862 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9\": container with ID starting with b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9 not found: ID does not exist" containerID="b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9" Apr 21 04:16:21.368935 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.368876 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9"} err="failed to get container status \"b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9\": rpc error: code = NotFound desc = could not find container \"b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9\": container with ID starting with b326f36ff65d3bb6eaa7d7a3093a2864a1605375a9d811c3b41eec4a385c04c9 not found: ID does not exist" Apr 21 04:16:21.373542 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.373523 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5"] Apr 21 04:16:21.376970 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:21.376950 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-xb9x5"] Apr 21 04:16:23.028083 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:23.028051 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" path="/var/lib/kubelet/pods/cba5af0e-c7e3-435d-bd11-635a21ac9446/volumes" Apr 21 04:16:25.360805 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:25.360750 2575 generic.go:358] "Generic (PLEG): container finished" podID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerID="2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea" exitCode=0 Apr 21 04:16:25.361265 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:25.360825 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" event={"ID":"8d3d69fa-743e-4812-a276-9db0ef3cb813","Type":"ContainerDied","Data":"2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea"} Apr 21 04:16:26.366332 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:26.366296 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" event={"ID":"8d3d69fa-743e-4812-a276-9db0ef3cb813","Type":"ContainerStarted","Data":"87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87"} Apr 21 04:16:26.366332 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:26.366336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" event={"ID":"8d3d69fa-743e-4812-a276-9db0ef3cb813","Type":"ContainerStarted","Data":"391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4"} Apr 21 04:16:26.366793 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:26.366563 
2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:26.366793 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:26.366626 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:16:26.388178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:26.388132 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" podStartSLOduration=7.388120294 podStartE2EDuration="7.388120294s" podCreationTimestamp="2026-04-21 04:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:16:26.386040122 +0000 UTC m=+1172.050630207" watchObservedRunningTime="2026-04-21 04:16:26.388120294 +0000 UTC m=+1172.052710379" Apr 21 04:16:32.374616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:16:32.374584 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:17:02.378038 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:02.378011 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:17:09.981657 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:09.981620 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x"] Apr 21 04:17:09.982208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:09.981965 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="kserve-container" 
containerID="cri-o://391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4" gracePeriod=30 Apr 21 04:17:09.982208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:09.981993 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="kube-rbac-proxy" containerID="cri-o://87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87" gracePeriod=30 Apr 21 04:17:10.078752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.078712 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72"] Apr 21 04:17:10.079073 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.079057 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kserve-container" Apr 21 04:17:10.079073 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.079075 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kserve-container" Apr 21 04:17:10.079169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.079084 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kube-rbac-proxy" Apr 21 04:17:10.079169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.079089 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kube-rbac-proxy" Apr 21 04:17:10.079169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.079108 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="storage-initializer" Apr 21 04:17:10.079169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.079113 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="storage-initializer" Apr 21 04:17:10.079169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.079161 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kserve-container" Apr 21 04:17:10.079169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.079168 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cba5af0e-c7e3-435d-bd11-635a21ac9446" containerName="kube-rbac-proxy" Apr 21 04:17:10.083734 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.083715 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.086194 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.086168 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 21 04:17:10.086313 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.086299 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 21 04:17:10.091475 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.091209 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72"] Apr 21 04:17:10.193494 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.193462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.193685 ip-10-0-134-15 kubenswrapper[2575]: 
I0421 04:17:10.193511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.193685 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.193590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.193685 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.193624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26v42\" (UniqueName: \"kubernetes.io/projected/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kube-api-access-26v42\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.294333 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.294227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.294333 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.294286 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.294333 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.294329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.294623 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.294352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26v42\" (UniqueName: \"kubernetes.io/projected/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kube-api-access-26v42\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.294805 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.294748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.295059 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.295035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.296864 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.296839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.302732 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.302705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26v42\" (UniqueName: \"kubernetes.io/projected/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kube-api-access-26v42\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.396551 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.396515 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:10.488853 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.488822 2575 generic.go:358] "Generic (PLEG): container finished" podID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerID="87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87" exitCode=2 Apr 21 04:17:10.489006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.488884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" event={"ID":"8d3d69fa-743e-4812-a276-9db0ef3cb813","Type":"ContainerDied","Data":"87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87"} Apr 21 04:17:10.515255 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:10.515233 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72"] Apr 21 04:17:10.517558 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:17:10.517537 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0225ebd6_66b5_4fc5_ab45_53d8252c3e3f.slice/crio-7b0d30b57936fd67a0e1801f3378d4070b606180f2184db406b5e7032151dd32 WatchSource:0}: Error finding container 7b0d30b57936fd67a0e1801f3378d4070b606180f2184db406b5e7032151dd32: Status 404 returned error can't find the container with id 7b0d30b57936fd67a0e1801f3378d4070b606180f2184db406b5e7032151dd32 Apr 21 04:17:11.130280 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.130259 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:17:11.199877 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.199805 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d3d69fa-743e-4812-a276-9db0ef3cb813-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"8d3d69fa-743e-4812-a276-9db0ef3cb813\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " Apr 21 04:17:11.199877 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.199868 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-759h7\" (UniqueName: \"kubernetes.io/projected/8d3d69fa-743e-4812-a276-9db0ef3cb813-kube-api-access-759h7\") pod \"8d3d69fa-743e-4812-a276-9db0ef3cb813\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " Apr 21 04:17:11.200064 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.199909 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3d69fa-743e-4812-a276-9db0ef3cb813-kserve-provision-location\") pod \"8d3d69fa-743e-4812-a276-9db0ef3cb813\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " Apr 21 04:17:11.200130 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.200107 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d3d69fa-743e-4812-a276-9db0ef3cb813-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "8d3d69fa-743e-4812-a276-9db0ef3cb813" (UID: "8d3d69fa-743e-4812-a276-9db0ef3cb813"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:17:11.200229 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.200207 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d3d69fa-743e-4812-a276-9db0ef3cb813-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d3d69fa-743e-4812-a276-9db0ef3cb813" (UID: "8d3d69fa-743e-4812-a276-9db0ef3cb813"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:17:11.201743 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.201723 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d3d69fa-743e-4812-a276-9db0ef3cb813-kube-api-access-759h7" (OuterVolumeSpecName: "kube-api-access-759h7") pod "8d3d69fa-743e-4812-a276-9db0ef3cb813" (UID: "8d3d69fa-743e-4812-a276-9db0ef3cb813"). InnerVolumeSpecName "kube-api-access-759h7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:17:11.301174 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.301152 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3d69fa-743e-4812-a276-9db0ef3cb813-proxy-tls\") pod \"8d3d69fa-743e-4812-a276-9db0ef3cb813\" (UID: \"8d3d69fa-743e-4812-a276-9db0ef3cb813\") " Apr 21 04:17:11.301294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.301283 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3d69fa-743e-4812-a276-9db0ef3cb813-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:17:11.301332 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.301297 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/8d3d69fa-743e-4812-a276-9db0ef3cb813-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:17:11.301332 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.301309 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-759h7\" (UniqueName: \"kubernetes.io/projected/8d3d69fa-743e-4812-a276-9db0ef3cb813-kube-api-access-759h7\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:17:11.303008 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.302977 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d3d69fa-743e-4812-a276-9db0ef3cb813-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8d3d69fa-743e-4812-a276-9db0ef3cb813" (UID: "8d3d69fa-743e-4812-a276-9db0ef3cb813"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:17:11.401734 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.401714 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3d69fa-743e-4812-a276-9db0ef3cb813-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:17:11.492722 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.492654 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerStarted","Data":"dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126"} Apr 21 04:17:11.492722 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.492692 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerStarted","Data":"7b0d30b57936fd67a0e1801f3378d4070b606180f2184db406b5e7032151dd32"} Apr 21 04:17:11.494305 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:17:11.494278 2575 generic.go:358] "Generic (PLEG): container finished" podID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerID="391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4" exitCode=0 Apr 21 04:17:11.494384 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.494334 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" Apr 21 04:17:11.494384 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.494357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" event={"ID":"8d3d69fa-743e-4812-a276-9db0ef3cb813","Type":"ContainerDied","Data":"391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4"} Apr 21 04:17:11.494468 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.494386 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x" event={"ID":"8d3d69fa-743e-4812-a276-9db0ef3cb813","Type":"ContainerDied","Data":"1ba5418f54c02eea04114930df80de9adcf0afb57e4df1cba9377450903e79c8"} Apr 21 04:17:11.494468 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.494403 2575 scope.go:117] "RemoveContainer" containerID="87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87" Apr 21 04:17:11.501838 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.501819 2575 scope.go:117] "RemoveContainer" containerID="391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4" Apr 21 04:17:11.509181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.509158 2575 scope.go:117] "RemoveContainer" containerID="2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea" Apr 21 04:17:11.516181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.516158 2575 scope.go:117] "RemoveContainer" containerID="87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87" Apr 21 
04:17:11.516428 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:17:11.516410 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87\": container with ID starting with 87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87 not found: ID does not exist" containerID="87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87" Apr 21 04:17:11.516476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.516438 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87"} err="failed to get container status \"87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87\": rpc error: code = NotFound desc = could not find container \"87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87\": container with ID starting with 87f8636a63665f5c76220fee6597cd56e54e64f1594cbf4e15cc5806990dac87 not found: ID does not exist" Apr 21 04:17:11.516476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.516456 2575 scope.go:117] "RemoveContainer" containerID="391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4" Apr 21 04:17:11.516685 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:17:11.516668 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4\": container with ID starting with 391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4 not found: ID does not exist" containerID="391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4" Apr 21 04:17:11.516748 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.516694 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4"} err="failed to get container status \"391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4\": rpc error: code = NotFound desc = could not find container \"391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4\": container with ID starting with 391a142d549fe96b4d968d94aaf9fd9e3e21d2dba88479de773da967583115c4 not found: ID does not exist" Apr 21 04:17:11.516748 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.516717 2575 scope.go:117] "RemoveContainer" containerID="2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea" Apr 21 04:17:11.517002 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:17:11.516985 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea\": container with ID starting with 2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea not found: ID does not exist" containerID="2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea" Apr 21 04:17:11.517048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.517008 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea"} err="failed to get container status \"2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea\": rpc error: code = NotFound desc = could not find container \"2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea\": container with ID starting with 2f104e92bf1c514a396f74b40393db3bc0cb1078fe92d482c12779fecf761bea not found: ID does not exist" Apr 21 04:17:11.525211 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.525190 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x"] Apr 21 04:17:11.526836 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:11.526813 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-shp6x"] Apr 21 04:17:13.027895 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:13.027865 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" path="/var/lib/kubelet/pods/8d3d69fa-743e-4812-a276-9db0ef3cb813/volumes" Apr 21 04:17:14.504824 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:14.504791 2575 generic.go:358] "Generic (PLEG): container finished" podID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerID="dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126" exitCode=0 Apr 21 04:17:14.505181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:14.504860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerDied","Data":"dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126"} Apr 21 04:17:15.509403 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:15.509368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerStarted","Data":"60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1"} Apr 21 04:17:18.520936 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:18.520897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerStarted","Data":"7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f"} Apr 21 04:17:18.520936 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:18.520936 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerStarted","Data":"e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2"} Apr 21 04:17:18.521461 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:18.521184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:18.521461 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:18.521199 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:18.545902 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:18.545856 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podStartSLOduration=5.480503172 podStartE2EDuration="8.545842243s" podCreationTimestamp="2026-04-21 04:17:10 +0000 UTC" firstStartedPulling="2026-04-21 04:17:14.561985124 +0000 UTC m=+1220.226575187" lastFinishedPulling="2026-04-21 04:17:17.627324195 +0000 UTC m=+1223.291914258" observedRunningTime="2026-04-21 04:17:18.544329189 +0000 UTC m=+1224.208919274" watchObservedRunningTime="2026-04-21 04:17:18.545842243 +0000 UTC m=+1224.210432327" Apr 21 04:17:19.524162 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:19.524131 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:25.531631 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:25.531600 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:17:45.533882 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:17:45.533804 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:18:25.534596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:25.534567 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:18:30.151020 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.150977 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72"] Apr 21 04:18:30.151498 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.151443 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-container" containerID="cri-o://60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1" gracePeriod=30 Apr 21 04:18:30.151686 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.151531 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-agent" containerID="cri-o://e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2" gracePeriod=30 Apr 21 04:18:30.151807 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.151659 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" containerID="cri-o://7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f" gracePeriod=30 Apr 21 04:18:30.233865 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.233834 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6"] Apr 21 04:18:30.234135 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.234123 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="storage-initializer" Apr 21 04:18:30.234187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.234137 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="storage-initializer" Apr 21 04:18:30.234187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.234146 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="kserve-container" Apr 21 04:18:30.234187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.234152 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="kserve-container" Apr 21 04:18:30.234187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.234164 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="kube-rbac-proxy" Apr 21 04:18:30.234187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.234171 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="kube-rbac-proxy" Apr 21 04:18:30.234339 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.234214 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="kserve-container" Apr 21 04:18:30.234339 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.234221 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d3d69fa-743e-4812-a276-9db0ef3cb813" containerName="kube-rbac-proxy" Apr 21 04:18:30.237231 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.237209 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.239641 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.239622 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 21 04:18:30.239741 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.239625 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 21 04:18:30.247130 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.247109 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6"] Apr 21 04:18:30.353201 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.353172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.353305 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.353218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/735f71e3-c536-4a66-943a-f8d73cf6559c-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.353353 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.353313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/735f71e3-c536-4a66-943a-f8d73cf6559c-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.353399 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.353348 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t74r\" (UniqueName: \"kubernetes.io/projected/735f71e3-c536-4a66-943a-f8d73cf6559c-kube-api-access-6t74r\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.454548 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.454488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6t74r\" (UniqueName: \"kubernetes.io/projected/735f71e3-c536-4a66-943a-f8d73cf6559c-kube-api-access-6t74r\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.454548 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.454540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.454728 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.454566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/735f71e3-c536-4a66-943a-f8d73cf6559c-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: 
\"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.454728 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.454609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/735f71e3-c536-4a66-943a-f8d73cf6559c-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.454728 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:18:30.454703 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-predictor-serving-cert: secret "isvc-paddle-predictor-serving-cert" not found Apr 21 04:18:30.454909 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:18:30.454787 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls podName:735f71e3-c536-4a66-943a-f8d73cf6559c nodeName:}" failed. No retries permitted until 2026-04-21 04:18:30.954743992 +0000 UTC m=+1296.619334073 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls") pod "isvc-paddle-predictor-6b8b7cfb4b-842g6" (UID: "735f71e3-c536-4a66-943a-f8d73cf6559c") : secret "isvc-paddle-predictor-serving-cert" not found Apr 21 04:18:30.455014 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.454995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/735f71e3-c536-4a66-943a-f8d73cf6559c-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.455292 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.455271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/735f71e3-c536-4a66-943a-f8d73cf6559c-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.463327 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.463308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t74r\" (UniqueName: \"kubernetes.io/projected/735f71e3-c536-4a66-943a-f8d73cf6559c-kube-api-access-6t74r\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.527436 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.527405 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 21 04:18:30.735506 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.735418 2575 generic.go:358] "Generic (PLEG): container finished" podID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerID="7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f" exitCode=2 Apr 21 04:18:30.735661 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.735497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerDied","Data":"7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f"} Apr 21 04:18:30.959462 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.959427 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:30.961747 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:30.961722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-842g6\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:31.147690 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:31.147661 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:31.263630 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:31.263594 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6"] Apr 21 04:18:31.267628 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:18:31.267587 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod735f71e3_c536_4a66_943a_f8d73cf6559c.slice/crio-21d7a1da34bd53bc5bbd94b9550dc199289251803249593870fde61c6fd88b4e WatchSource:0}: Error finding container 21d7a1da34bd53bc5bbd94b9550dc199289251803249593870fde61c6fd88b4e: Status 404 returned error can't find the container with id 21d7a1da34bd53bc5bbd94b9550dc199289251803249593870fde61c6fd88b4e Apr 21 04:18:31.739252 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:31.739217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" event={"ID":"735f71e3-c536-4a66-943a-f8d73cf6559c","Type":"ContainerStarted","Data":"93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb"} Apr 21 04:18:31.739252 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:31.739252 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" event={"ID":"735f71e3-c536-4a66-943a-f8d73cf6559c","Type":"ContainerStarted","Data":"21d7a1da34bd53bc5bbd94b9550dc199289251803249593870fde61c6fd88b4e"} Apr 21 04:18:32.744353 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:32.744314 2575 generic.go:358] "Generic (PLEG): container finished" podID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerID="60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1" exitCode=0 Apr 21 04:18:32.744711 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:32.744387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerDied","Data":"60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1"} Apr 21 04:18:35.527589 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:35.527549 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 21 04:18:35.532856 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:35.532829 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 21 04:18:36.756875 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:36.756842 2575 generic.go:358] "Generic (PLEG): container finished" podID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerID="93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb" exitCode=0 Apr 21 04:18:36.757214 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:36.756916 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" event={"ID":"735f71e3-c536-4a66-943a-f8d73cf6559c","Type":"ContainerDied","Data":"93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb"} Apr 21 04:18:40.527729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:40.527686 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 21 04:18:40.528197 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:40.527849 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:18:45.527570 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:45.527531 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 21 04:18:45.532985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:45.532954 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 21 04:18:48.793371 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:48.793287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" event={"ID":"735f71e3-c536-4a66-943a-f8d73cf6559c","Type":"ContainerStarted","Data":"37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449"} Apr 21 04:18:48.793371 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:48.793331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" event={"ID":"735f71e3-c536-4a66-943a-f8d73cf6559c","Type":"ContainerStarted","Data":"a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb"} Apr 21 04:18:48.793911 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:48.793554 2575 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:48.811204 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:48.811159 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" podStartSLOduration=7.166109742 podStartE2EDuration="18.811146659s" podCreationTimestamp="2026-04-21 04:18:30 +0000 UTC" firstStartedPulling="2026-04-21 04:18:36.757992502 +0000 UTC m=+1302.422582566" lastFinishedPulling="2026-04-21 04:18:48.403029416 +0000 UTC m=+1314.067619483" observedRunningTime="2026-04-21 04:18:48.810276156 +0000 UTC m=+1314.474866241" watchObservedRunningTime="2026-04-21 04:18:48.811146659 +0000 UTC m=+1314.475736745" Apr 21 04:18:49.796428 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:49.796398 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:49.797420 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:49.797396 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 21 04:18:50.527601 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:50.527551 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 21 04:18:50.799929 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:50.799842 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" 
podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 21 04:18:55.526788 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:55.526722 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 21 04:18:55.533187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:55.533160 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 21 04:18:55.533294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:55.533280 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:18:55.804314 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:55.804232 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:18:55.804834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:18:55.804810 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 21 04:19:00.289862 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.289839 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:19:00.389147 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.389118 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " Apr 21 04:19:00.389289 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.389161 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-proxy-tls\") pod \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " Apr 21 04:19:00.389289 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.389184 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kserve-provision-location\") pod \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " Apr 21 04:19:00.389378 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.389362 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26v42\" (UniqueName: \"kubernetes.io/projected/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kube-api-access-26v42\") pod \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\" (UID: \"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f\") " Apr 21 04:19:00.389491 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.389468 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" (UID: "0225ebd6-66b5-4fc5-ab45-53d8252c3e3f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:19:00.389564 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.389516 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" (UID: "0225ebd6-66b5-4fc5-ab45-53d8252c3e3f"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:19:00.389622 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.389598 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:19:00.389622 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.389611 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:19:00.391203 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.391175 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" (UID: "0225ebd6-66b5-4fc5-ab45-53d8252c3e3f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:19:00.391404 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.391383 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kube-api-access-26v42" (OuterVolumeSpecName: "kube-api-access-26v42") pod "0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" (UID: "0225ebd6-66b5-4fc5-ab45-53d8252c3e3f"). InnerVolumeSpecName "kube-api-access-26v42". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:19:00.490105 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.490043 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:19:00.490105 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.490064 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26v42\" (UniqueName: \"kubernetes.io/projected/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f-kube-api-access-26v42\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:19:00.827950 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.827865 2575 generic.go:358] "Generic (PLEG): container finished" podID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerID="e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2" exitCode=137 Apr 21 04:19:00.827950 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.827920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerDied","Data":"e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2"} Apr 21 04:19:00.827950 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.827946 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" 
event={"ID":"0225ebd6-66b5-4fc5-ab45-53d8252c3e3f","Type":"ContainerDied","Data":"7b0d30b57936fd67a0e1801f3378d4070b606180f2184db406b5e7032151dd32"} Apr 21 04:19:00.828208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.827962 2575 scope.go:117] "RemoveContainer" containerID="7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f" Apr 21 04:19:00.828208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.827968 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72" Apr 21 04:19:00.835698 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.835678 2575 scope.go:117] "RemoveContainer" containerID="e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2" Apr 21 04:19:00.842304 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.842286 2575 scope.go:117] "RemoveContainer" containerID="60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1" Apr 21 04:19:00.849313 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.849285 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72"] Apr 21 04:19:00.849848 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.849830 2575 scope.go:117] "RemoveContainer" containerID="dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126" Apr 21 04:19:00.851914 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.851894 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-rgm72"] Apr 21 04:19:00.857052 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.857035 2575 scope.go:117] "RemoveContainer" containerID="7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f" Apr 21 04:19:00.857303 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:19:00.857283 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f\": container with ID starting with 7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f not found: ID does not exist" containerID="7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f" Apr 21 04:19:00.857361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.857309 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f"} err="failed to get container status \"7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f\": rpc error: code = NotFound desc = could not find container \"7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f\": container with ID starting with 7192c137589922b2e92709055468e2ce174a4ecde80e6e498ab54a509ba4d48f not found: ID does not exist" Apr 21 04:19:00.857361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.857327 2575 scope.go:117] "RemoveContainer" containerID="e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2" Apr 21 04:19:00.857551 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:19:00.857538 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2\": container with ID starting with e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2 not found: ID does not exist" containerID="e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2" Apr 21 04:19:00.857594 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.857554 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2"} err="failed to get container status \"e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2\": rpc error: code = NotFound desc = could not find container 
\"e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2\": container with ID starting with e3055435efb7ae8e22fbd22523ba6d90e4e5fbcd4fcc9120ab3a40dda42184d2 not found: ID does not exist" Apr 21 04:19:00.857594 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.857564 2575 scope.go:117] "RemoveContainer" containerID="60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1" Apr 21 04:19:00.857787 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:19:00.857769 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1\": container with ID starting with 60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1 not found: ID does not exist" containerID="60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1" Apr 21 04:19:00.857847 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.857791 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1"} err="failed to get container status \"60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1\": rpc error: code = NotFound desc = could not find container \"60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1\": container with ID starting with 60c8b07732e1c7e19691946170884a2133ca8351bd77443d45dddd983e5603d1 not found: ID does not exist" Apr 21 04:19:00.857847 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.857804 2575 scope.go:117] "RemoveContainer" containerID="dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126" Apr 21 04:19:00.857998 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:19:00.857984 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126\": container with ID starting with 
dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126 not found: ID does not exist" containerID="dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126" Apr 21 04:19:00.858045 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:00.858000 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126"} err="failed to get container status \"dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126\": rpc error: code = NotFound desc = could not find container \"dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126\": container with ID starting with dd9ff3f45a7b81581a3939ad2f510895af2998af0754fd32a23237957bf59126 not found: ID does not exist" Apr 21 04:19:01.027219 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:01.027193 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" path="/var/lib/kubelet/pods/0225ebd6-66b5-4fc5-ab45-53d8252c3e3f/volumes" Apr 21 04:19:05.805301 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:05.805263 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 21 04:19:15.805428 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:15.805343 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 21 04:19:25.805633 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:25.805595 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" 
podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 21 04:19:35.805943 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:35.805912 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" Apr 21 04:19:41.715903 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.715870 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6"] Apr 21 04:19:41.716308 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.716261 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container" containerID="cri-o://a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb" gracePeriod=30 Apr 21 04:19:41.716380 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.716298 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kube-rbac-proxy" containerID="cri-o://37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449" gracePeriod=30 Apr 21 04:19:41.798895 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.798867 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"] Apr 21 04:19:41.799131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799119 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-container" Apr 21 04:19:41.799176 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799133 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-container" Apr 21 04:19:41.799176 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799146 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" Apr 21 04:19:41.799176 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799151 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" Apr 21 04:19:41.799176 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799165 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="storage-initializer" Apr 21 04:19:41.799176 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799170 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="storage-initializer" Apr 21 04:19:41.799323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799181 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-agent" Apr 21 04:19:41.799323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799186 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-agent" Apr 21 04:19:41.799323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799229 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-container" Apr 21 04:19:41.799323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799236 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kube-rbac-proxy" Apr 21 04:19:41.799323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.799243 2575 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="0225ebd6-66b5-4fc5-ab45-53d8252c3e3f" containerName="kserve-agent"
Apr 21 04:19:41.804506 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.804489 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.806948 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.806928 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\""
Apr 21 04:19:41.807038 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.806934 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\""
Apr 21 04:19:41.810322 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.810302 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"]
Apr 21 04:19:41.880054 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.880026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.880178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.880076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sr7m\" (UniqueName: \"kubernetes.io/projected/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kube-api-access-5sr7m\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.880178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.880124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.880178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.880159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.948430 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.948405 2575 generic.go:358] "Generic (PLEG): container finished" podID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerID="37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449" exitCode=2
Apr 21 04:19:41.948549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.948470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" event={"ID":"735f71e3-c536-4a66-943a-f8d73cf6559c","Type":"ContainerDied","Data":"37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449"}
Apr 21 04:19:41.980945 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.980891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sr7m\" (UniqueName: \"kubernetes.io/projected/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kube-api-access-5sr7m\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.980945 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.980923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.981072 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.980945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.981072 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.980990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.981440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.981417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.981739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.981710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.983395 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.983376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:41.988882 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:41.988864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sr7m\" (UniqueName: \"kubernetes.io/projected/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kube-api-access-5sr7m\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:42.115591 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:42.115556 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:42.230434 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:42.230409 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"]
Apr 21 04:19:42.232937 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:19:42.232872 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce7e5ad3_0b4d_4e0c_a300_847809b932c9.slice/crio-09d2a4f0a14983a11bd5f4ebd5175f3919ce20c0d7cbc576c7d3c282d4b35588 WatchSource:0}: Error finding container 09d2a4f0a14983a11bd5f4ebd5175f3919ce20c0d7cbc576c7d3c282d4b35588: Status 404 returned error can't find the container with id 09d2a4f0a14983a11bd5f4ebd5175f3919ce20c0d7cbc576c7d3c282d4b35588
Apr 21 04:19:42.953313 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:42.953270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" event={"ID":"ce7e5ad3-0b4d-4e0c-a300-847809b932c9","Type":"ContainerStarted","Data":"0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775"}
Apr 21 04:19:42.953313 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:42.953315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" event={"ID":"ce7e5ad3-0b4d-4e0c-a300-847809b932c9","Type":"ContainerStarted","Data":"09d2a4f0a14983a11bd5f4ebd5175f3919ce20c0d7cbc576c7d3c282d4b35588"}
Apr 21 04:19:44.253226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.253197 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6"
Apr 21 04:19:44.300086 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.300025 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/735f71e3-c536-4a66-943a-f8d73cf6559c-kserve-provision-location\") pod \"735f71e3-c536-4a66-943a-f8d73cf6559c\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") "
Apr 21 04:19:44.300205 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.300088 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t74r\" (UniqueName: \"kubernetes.io/projected/735f71e3-c536-4a66-943a-f8d73cf6559c-kube-api-access-6t74r\") pod \"735f71e3-c536-4a66-943a-f8d73cf6559c\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") "
Apr 21 04:19:44.300205 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.300116 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls\") pod \"735f71e3-c536-4a66-943a-f8d73cf6559c\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") "
Apr 21 04:19:44.300205 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.300162 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/735f71e3-c536-4a66-943a-f8d73cf6559c-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"735f71e3-c536-4a66-943a-f8d73cf6559c\" (UID: \"735f71e3-c536-4a66-943a-f8d73cf6559c\") "
Apr 21 04:19:44.300549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.300522 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/735f71e3-c536-4a66-943a-f8d73cf6559c-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "735f71e3-c536-4a66-943a-f8d73cf6559c" (UID: "735f71e3-c536-4a66-943a-f8d73cf6559c"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:19:44.302160 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.302138 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735f71e3-c536-4a66-943a-f8d73cf6559c-kube-api-access-6t74r" (OuterVolumeSpecName: "kube-api-access-6t74r") pod "735f71e3-c536-4a66-943a-f8d73cf6559c" (UID: "735f71e3-c536-4a66-943a-f8d73cf6559c"). InnerVolumeSpecName "kube-api-access-6t74r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:19:44.302307 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.302286 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "735f71e3-c536-4a66-943a-f8d73cf6559c" (UID: "735f71e3-c536-4a66-943a-f8d73cf6559c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:19:44.309739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.309716 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/735f71e3-c536-4a66-943a-f8d73cf6559c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "735f71e3-c536-4a66-943a-f8d73cf6559c" (UID: "735f71e3-c536-4a66-943a-f8d73cf6559c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:19:44.401628 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.401607 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/735f71e3-c536-4a66-943a-f8d73cf6559c-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:19:44.401628 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.401626 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6t74r\" (UniqueName: \"kubernetes.io/projected/735f71e3-c536-4a66-943a-f8d73cf6559c-kube-api-access-6t74r\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:19:44.401740 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.401636 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/735f71e3-c536-4a66-943a-f8d73cf6559c-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:19:44.401740 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.401647 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/735f71e3-c536-4a66-943a-f8d73cf6559c-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:19:44.960115 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.960077 2575 generic.go:358] "Generic (PLEG): container finished" podID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerID="a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb" exitCode=0
Apr 21 04:19:44.960293 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.960166 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" event={"ID":"735f71e3-c536-4a66-943a-f8d73cf6559c","Type":"ContainerDied","Data":"a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb"}
Apr 21 04:19:44.960293 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.960187 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6"
Apr 21 04:19:44.960293 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.960206 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6" event={"ID":"735f71e3-c536-4a66-943a-f8d73cf6559c","Type":"ContainerDied","Data":"21d7a1da34bd53bc5bbd94b9550dc199289251803249593870fde61c6fd88b4e"}
Apr 21 04:19:44.960293 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.960223 2575 scope.go:117] "RemoveContainer" containerID="37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449"
Apr 21 04:19:44.968161 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.968144 2575 scope.go:117] "RemoveContainer" containerID="a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb"
Apr 21 04:19:44.974726 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.974711 2575 scope.go:117] "RemoveContainer" containerID="93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb"
Apr 21 04:19:44.981018 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.980997 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6"]
Apr 21 04:19:44.981483 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.981463 2575 scope.go:117] "RemoveContainer" containerID="37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449"
Apr 21 04:19:44.981720 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:19:44.981703 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449\": container with ID starting with 37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449 not found: ID does not exist" containerID="37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449"
Apr 21 04:19:44.981804 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.981729 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449"} err="failed to get container status \"37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449\": rpc error: code = NotFound desc = could not find container \"37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449\": container with ID starting with 37bd9d1c6bba68954fc82c70f47803e74f010ce35c58a946592fb7bd9aa8a449 not found: ID does not exist"
Apr 21 04:19:44.981804 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.981751 2575 scope.go:117] "RemoveContainer" containerID="a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb"
Apr 21 04:19:44.981983 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:19:44.981958 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb\": container with ID starting with a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb not found: ID does not exist" containerID="a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb"
Apr 21 04:19:44.982049 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.981991 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb"} err="failed to get container status \"a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb\": rpc error: code = NotFound desc = could not find container \"a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb\": container with ID starting with a8a6c2150f6cc66f09cae3d6a9685fed1eaab689111c848780dd7c5d38b689bb not found: ID does not exist"
Apr 21 04:19:44.982049 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.982010 2575 scope.go:117] "RemoveContainer" containerID="93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb"
Apr 21 04:19:44.982225 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:19:44.982209 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb\": container with ID starting with 93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb not found: ID does not exist" containerID="93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb"
Apr 21 04:19:44.982275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.982231 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb"} err="failed to get container status \"93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb\": rpc error: code = NotFound desc = could not find container \"93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb\": container with ID starting with 93e48f64f31590ef72600ba99051eb4550977bbbcad3a9a47ce75d2059bb3ffb not found: ID does not exist"
Apr 21 04:19:44.984897 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:44.984878 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-842g6"]
Apr 21 04:19:45.027870 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:45.027846 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" path="/var/lib/kubelet/pods/735f71e3-c536-4a66-943a-f8d73cf6559c/volumes"
Apr 21 04:19:46.968611 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:46.968575 2575 generic.go:358] "Generic (PLEG): container finished" podID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerID="0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775" exitCode=0
Apr 21 04:19:46.968991 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:46.968647 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" event={"ID":"ce7e5ad3-0b4d-4e0c-a300-847809b932c9","Type":"ContainerDied","Data":"0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775"}
Apr 21 04:19:47.973580 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:47.973545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" event={"ID":"ce7e5ad3-0b4d-4e0c-a300-847809b932c9","Type":"ContainerStarted","Data":"93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22"}
Apr 21 04:19:47.973580 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:47.973586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" event={"ID":"ce7e5ad3-0b4d-4e0c-a300-847809b932c9","Type":"ContainerStarted","Data":"a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1"}
Apr 21 04:19:47.974071 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:47.973865 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:47.974071 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:47.974007 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:47.975169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:47.975142 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 21 04:19:47.992237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:47.992189 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podStartSLOduration=6.9921719289999995 podStartE2EDuration="6.992171929s" podCreationTimestamp="2026-04-21 04:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:19:47.990330278 +0000 UTC m=+1373.654920365" watchObservedRunningTime="2026-04-21 04:19:47.992171929 +0000 UTC m=+1373.656762016"
Apr 21 04:19:48.976389 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:48.976356 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 21 04:19:53.981087 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:53.981060 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:19:53.981635 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:19:53.981603 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 21 04:20:03.982251 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:03.982208 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 21 04:20:13.982313 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:13.982272 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 21 04:20:23.981538 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:23.981482 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 21 04:20:33.981956 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:33.981925 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:20:43.277549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.277517 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"]
Apr 21 04:20:43.278138 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.277953 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" containerID="cri-o://a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1" gracePeriod=30
Apr 21 04:20:43.278138 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.278037 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kube-rbac-proxy" containerID="cri-o://93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22" gracePeriod=30
Apr 21 04:20:43.384374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384346 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"]
Apr 21 04:20:43.384651 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384639 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kube-rbac-proxy"
Apr 21 04:20:43.384718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384653 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kube-rbac-proxy"
Apr 21 04:20:43.384718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384662 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="storage-initializer"
Apr 21 04:20:43.384718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384668 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="storage-initializer"
Apr 21 04:20:43.384718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384686 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container"
Apr 21 04:20:43.384718 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384692 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container"
Apr 21 04:20:43.384887 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384737 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kube-rbac-proxy"
Apr 21 04:20:43.384887 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.384745 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="735f71e3-c536-4a66-943a-f8d73cf6559c" containerName="kserve-container"
Apr 21 04:20:43.386649 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.386631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.388936 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.388918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 21 04:20:43.389075 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.389058 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\""
Apr 21 04:20:43.395195 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.395176 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"]
Apr 21 04:20:43.527666 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.527593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.527666 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.527641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szc2f\" (UniqueName: \"kubernetes.io/projected/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kube-api-access-szc2f\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.527857 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.527721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47536376-5ab9-4bbe-9f8b-6bae72ec4106-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.527857 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.527779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47536376-5ab9-4bbe-9f8b-6bae72ec4106-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.629010 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.628978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szc2f\" (UniqueName: \"kubernetes.io/projected/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kube-api-access-szc2f\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.629153 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.629025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47536376-5ab9-4bbe-9f8b-6bae72ec4106-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.629153 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.629055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47536376-5ab9-4bbe-9f8b-6bae72ec4106-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.629153 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.629118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.629554 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.629520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.629684 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.629660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47536376-5ab9-4bbe-9f8b-6bae72ec4106-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.631567 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.631545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47536376-5ab9-4bbe-9f8b-6bae72ec4106-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.637408 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.637383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szc2f\" (UniqueName: \"kubernetes.io/projected/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kube-api-access-szc2f\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.697141 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.697119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:20:43.812917 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.812780 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"]
Apr 21 04:20:43.815437 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:20:43.815413 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47536376_5ab9_4bbe_9f8b_6bae72ec4106.slice/crio-37415ff5922f0ccd3e0d9e06b2eb28269e897f4b0968a67a241fadadb7497c6c WatchSource:0}: Error finding container 37415ff5922f0ccd3e0d9e06b2eb28269e897f4b0968a67a241fadadb7497c6c: Status 404 returned error can't find the container with id 37415ff5922f0ccd3e0d9e06b2eb28269e897f4b0968a67a241fadadb7497c6c
Apr 21 04:20:43.817358 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.817345 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:20:43.977212 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.977173 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused"
Apr 21 04:20:43.981530 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:43.981508 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 21 04:20:44.123341 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:44.123306 2575 generic.go:358] "Generic (PLEG): container finished" podID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerID="93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22" exitCode=2
Apr 21 04:20:44.123341 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:44.123336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" event={"ID":"ce7e5ad3-0b4d-4e0c-a300-847809b932c9","Type":"ContainerDied","Data":"93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22"}
Apr 21 04:20:44.124586 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:44.124564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" event={"ID":"47536376-5ab9-4bbe-9f8b-6bae72ec4106","Type":"ContainerStarted","Data":"616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa"}
Apr 21 04:20:44.124685 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:44.124591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" event={"ID":"47536376-5ab9-4bbe-9f8b-6bae72ec4106","Type":"ContainerStarted","Data":"37415ff5922f0ccd3e0d9e06b2eb28269e897f4b0968a67a241fadadb7497c6c"}
Apr 21 04:20:45.823832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.823807 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"
Apr 21 04:20:45.847344 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.847317 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kserve-provision-location\") pod \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") "
Apr 21 04:20:45.847439 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.847387 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") "
Apr 21 04:20:45.847439 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.847420 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sr7m\" (UniqueName: \"kubernetes.io/projected/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kube-api-access-5sr7m\") pod \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") "
Apr 21 04:20:45.847570 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.847463 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-proxy-tls\") pod \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\" (UID: \"ce7e5ad3-0b4d-4e0c-a300-847809b932c9\") "
Apr 21 04:20:45.847723 ip-10-0-134-15
kubenswrapper[2575]: I0421 04:20:45.847691 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "ce7e5ad3-0b4d-4e0c-a300-847809b932c9" (UID: "ce7e5ad3-0b4d-4e0c-a300-847809b932c9"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:20:45.849614 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.849586 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ce7e5ad3-0b4d-4e0c-a300-847809b932c9" (UID: "ce7e5ad3-0b4d-4e0c-a300-847809b932c9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:20:45.849965 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.849927 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kube-api-access-5sr7m" (OuterVolumeSpecName: "kube-api-access-5sr7m") pod "ce7e5ad3-0b4d-4e0c-a300-847809b932c9" (UID: "ce7e5ad3-0b4d-4e0c-a300-847809b932c9"). InnerVolumeSpecName "kube-api-access-5sr7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:20:45.857448 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.857427 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ce7e5ad3-0b4d-4e0c-a300-847809b932c9" (UID: "ce7e5ad3-0b4d-4e0c-a300-847809b932c9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:20:45.948308 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.948253 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:20:45.948308 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.948276 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:20:45.948308 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.948286 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:20:45.948308 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:45.948297 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5sr7m\" (UniqueName: \"kubernetes.io/projected/ce7e5ad3-0b4d-4e0c-a300-847809b932c9-kube-api-access-5sr7m\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:20:46.131009 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.130978 2575 generic.go:358] "Generic (PLEG): container finished" podID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerID="a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1" exitCode=0 Apr 21 04:20:46.131126 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.131054 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" Apr 21 04:20:46.131126 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.131054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" event={"ID":"ce7e5ad3-0b4d-4e0c-a300-847809b932c9","Type":"ContainerDied","Data":"a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1"} Apr 21 04:20:46.131126 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.131091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49" event={"ID":"ce7e5ad3-0b4d-4e0c-a300-847809b932c9","Type":"ContainerDied","Data":"09d2a4f0a14983a11bd5f4ebd5175f3919ce20c0d7cbc576c7d3c282d4b35588"} Apr 21 04:20:46.131126 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.131107 2575 scope.go:117] "RemoveContainer" containerID="93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22" Apr 21 04:20:46.139624 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.139494 2575 scope.go:117] "RemoveContainer" containerID="a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1" Apr 21 04:20:46.146187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.146162 2575 scope.go:117] "RemoveContainer" containerID="0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775" Apr 21 04:20:46.152411 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.152391 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"] Apr 21 04:20:46.153075 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.153060 2575 scope.go:117] "RemoveContainer" containerID="93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22" Apr 21 04:20:46.153303 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:20:46.153287 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22\": container with ID starting with 93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22 not found: ID does not exist" containerID="93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22" Apr 21 04:20:46.153346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.153318 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22"} err="failed to get container status \"93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22\": rpc error: code = NotFound desc = could not find container \"93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22\": container with ID starting with 93c55faa0fe3f4f956b86839927589f305f03299532904bad0ae1a11d33a4d22 not found: ID does not exist" Apr 21 04:20:46.153346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.153336 2575 scope.go:117] "RemoveContainer" containerID="a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1" Apr 21 04:20:46.153579 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:20:46.153564 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1\": container with ID starting with a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1 not found: ID does not exist" containerID="a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1" Apr 21 04:20:46.153619 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.153582 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1"} err="failed to get container status \"a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1\": rpc error: code = NotFound desc = could not find container 
\"a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1\": container with ID starting with a81caa37c4934ce4eff84b8f07e97f5918b953b1b0ae16577503b76e9ca8bde1 not found: ID does not exist" Apr 21 04:20:46.153619 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.153595 2575 scope.go:117] "RemoveContainer" containerID="0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775" Apr 21 04:20:46.153860 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:20:46.153845 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775\": container with ID starting with 0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775 not found: ID does not exist" containerID="0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775" Apr 21 04:20:46.153911 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.153862 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775"} err="failed to get container status \"0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775\": rpc error: code = NotFound desc = could not find container \"0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775\": container with ID starting with 0a9a4de8f9a0c6e1d555737a1bb1952dbedda68d9cb3e6d94bee15e1c430e775 not found: ID does not exist" Apr 21 04:20:46.158353 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:46.158331 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-2hl49"] Apr 21 04:20:47.028062 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:47.028019 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" path="/var/lib/kubelet/pods/ce7e5ad3-0b4d-4e0c-a300-847809b932c9/volumes" Apr 21 04:20:48.138432 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:48.138404 2575 generic.go:358] "Generic (PLEG): container finished" podID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerID="616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa" exitCode=0 Apr 21 04:20:48.138719 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:48.138480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" event={"ID":"47536376-5ab9-4bbe-9f8b-6bae72ec4106","Type":"ContainerDied","Data":"616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa"} Apr 21 04:20:49.143310 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:49.143269 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" event={"ID":"47536376-5ab9-4bbe-9f8b-6bae72ec4106","Type":"ContainerStarted","Data":"e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712"} Apr 21 04:20:49.143662 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:49.143317 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" event={"ID":"47536376-5ab9-4bbe-9f8b-6bae72ec4106","Type":"ContainerStarted","Data":"45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b"} Apr 21 04:20:49.143662 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:49.143633 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" Apr 21 04:20:49.143828 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:49.143783 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" Apr 21 04:20:49.144967 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:49.144941 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 21 04:20:49.161217 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:49.161176 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podStartSLOduration=6.16116469 podStartE2EDuration="6.16116469s" podCreationTimestamp="2026-04-21 04:20:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:20:49.160666794 +0000 UTC m=+1434.825256882" watchObservedRunningTime="2026-04-21 04:20:49.16116469 +0000 UTC m=+1434.825754776" Apr 21 04:20:50.147087 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:50.147053 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 21 04:20:55.151984 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:55.151955 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" Apr 21 04:20:55.152592 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:20:55.152565 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 21 04:21:05.152594 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:05.152557 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 21 04:21:15.152866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:15.152819 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 21 04:21:25.152960 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:25.152920 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 21 04:21:35.154037 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:35.154003 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" Apr 21 04:21:45.090786 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.090730 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"] Apr 21 04:21:45.091261 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.091072 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container" containerID="cri-o://45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b" gracePeriod=30 Apr 21 04:21:45.091261 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.091113 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kube-rbac-proxy" containerID="cri-o://e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712" gracePeriod=30 Apr 21 04:21:45.148188 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.148150 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused" Apr 21 04:21:45.152459 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.152435 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 21 04:21:45.175574 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.175547 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"] Apr 21 04:21:45.175921 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.175904 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="storage-initializer" Apr 21 04:21:45.176031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.175923 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="storage-initializer" Apr 21 04:21:45.176031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.175935 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" Apr 21 04:21:45.176031 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:21:45.175943 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" Apr 21 04:21:45.176031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.175953 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kube-rbac-proxy" Apr 21 04:21:45.176031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.175960 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kube-rbac-proxy" Apr 21 04:21:45.176319 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.176034 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kube-rbac-proxy" Apr 21 04:21:45.176319 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.176046 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce7e5ad3-0b4d-4e0c-a300-847809b932c9" containerName="kserve-container" Apr 21 04:21:45.179082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.179063 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.181408 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.181385 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 21 04:21:45.181796 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.181385 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 21 04:21:45.186391 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.186345 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"] Apr 21 04:21:45.270175 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.270143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.270305 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.270187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwfxc\" (UniqueName: \"kubernetes.io/projected/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kube-api-access-vwfxc\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.270305 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.270252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: 
\"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.270412 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.270340 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.292338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.292312 2575 generic.go:358] "Generic (PLEG): container finished" podID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerID="e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712" exitCode=2 Apr 21 04:21:45.292426 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.292384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" event={"ID":"47536376-5ab9-4bbe-9f8b-6bae72ec4106","Type":"ContainerDied","Data":"e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712"} Apr 21 04:21:45.371046 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.371025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.371146 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.371060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls\") pod 
\"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.371146 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.371086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwfxc\" (UniqueName: \"kubernetes.io/projected/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kube-api-access-vwfxc\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.371146 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.371122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:21:45.371266 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:21:45.371191 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-predictor-serving-cert: secret "isvc-pmml-predictor-serving-cert" not found Apr 21 04:21:45.371266 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:21:45.371250 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls podName:84d324b3-1eb6-4f44-b7c4-7037aaa6d75d nodeName:}" failed. No retries permitted until 2026-04-21 04:21:45.871228549 +0000 UTC m=+1491.535818618 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls") pod "isvc-pmml-predictor-8bb578669-kk7lq" (UID: "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d") : secret "isvc-pmml-predictor-serving-cert" not found
Apr 21 04:21:45.371521 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.371504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:21:45.371652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.371632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:21:45.379817 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.379795 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwfxc\" (UniqueName: \"kubernetes.io/projected/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kube-api-access-vwfxc\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:21:45.875076 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.875044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:21:45.877578 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:45.877545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-kk7lq\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:21:46.091404 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:46.091369 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:21:46.215102 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:46.215068 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"]
Apr 21 04:21:46.218012 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:21:46.217975 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d324b3_1eb6_4f44_b7c4_7037aaa6d75d.slice/crio-c552da284f9fb02933ad95e1dd82416d34cbc112025fc5b41b753a547ba0ea5f WatchSource:0}: Error finding container c552da284f9fb02933ad95e1dd82416d34cbc112025fc5b41b753a547ba0ea5f: Status 404 returned error can't find the container with id c552da284f9fb02933ad95e1dd82416d34cbc112025fc5b41b753a547ba0ea5f
Apr 21 04:21:46.296782 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:46.296736 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" event={"ID":"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d","Type":"ContainerStarted","Data":"32b87ad3b978843fabea65eb5479627d73ca3f5c55cea874459bb67baef4be2c"}
Apr 21 04:21:46.296880 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:46.296796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" event={"ID":"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d","Type":"ContainerStarted","Data":"c552da284f9fb02933ad95e1dd82416d34cbc112025fc5b41b753a547ba0ea5f"}
Apr 21 04:21:47.834879 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:47.834857 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:21:47.992595 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:47.992520 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47536376-5ab9-4bbe-9f8b-6bae72ec4106-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") "
Apr 21 04:21:47.992595 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:47.992567 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kserve-provision-location\") pod \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") "
Apr 21 04:21:47.992751 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:47.992600 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47536376-5ab9-4bbe-9f8b-6bae72ec4106-proxy-tls\") pod \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") "
Apr 21 04:21:47.992751 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:47.992650 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szc2f\" (UniqueName: \"kubernetes.io/projected/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kube-api-access-szc2f\") pod \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\" (UID: \"47536376-5ab9-4bbe-9f8b-6bae72ec4106\") "
Apr 21 04:21:47.992965 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:47.992935 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47536376-5ab9-4bbe-9f8b-6bae72ec4106-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "47536376-5ab9-4bbe-9f8b-6bae72ec4106" (UID: "47536376-5ab9-4bbe-9f8b-6bae72ec4106"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:21:47.994885 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:47.994857 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kube-api-access-szc2f" (OuterVolumeSpecName: "kube-api-access-szc2f") pod "47536376-5ab9-4bbe-9f8b-6bae72ec4106" (UID: "47536376-5ab9-4bbe-9f8b-6bae72ec4106"). InnerVolumeSpecName "kube-api-access-szc2f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:21:47.994885 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:47.994868 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47536376-5ab9-4bbe-9f8b-6bae72ec4106-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "47536376-5ab9-4bbe-9f8b-6bae72ec4106" (UID: "47536376-5ab9-4bbe-9f8b-6bae72ec4106"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:21:48.002636 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.002612 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "47536376-5ab9-4bbe-9f8b-6bae72ec4106" (UID: "47536376-5ab9-4bbe-9f8b-6bae72ec4106"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:21:48.093643 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.093619 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47536376-5ab9-4bbe-9f8b-6bae72ec4106-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:21:48.093643 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.093643 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:21:48.093798 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.093657 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47536376-5ab9-4bbe-9f8b-6bae72ec4106-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:21:48.093798 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.093674 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szc2f\" (UniqueName: \"kubernetes.io/projected/47536376-5ab9-4bbe-9f8b-6bae72ec4106-kube-api-access-szc2f\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:21:48.303290 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.303232 2575 generic.go:358] "Generic (PLEG): container finished" podID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerID="45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b" exitCode=0
Apr 21 04:21:48.303403 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.303308 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"
Apr 21 04:21:48.303403 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.303314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" event={"ID":"47536376-5ab9-4bbe-9f8b-6bae72ec4106","Type":"ContainerDied","Data":"45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b"}
Apr 21 04:21:48.303403 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.303357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5" event={"ID":"47536376-5ab9-4bbe-9f8b-6bae72ec4106","Type":"ContainerDied","Data":"37415ff5922f0ccd3e0d9e06b2eb28269e897f4b0968a67a241fadadb7497c6c"}
Apr 21 04:21:48.303403 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.303377 2575 scope.go:117] "RemoveContainer" containerID="e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712"
Apr 21 04:21:48.311458 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.311438 2575 scope.go:117] "RemoveContainer" containerID="45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b"
Apr 21 04:21:48.320635 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.320618 2575 scope.go:117] "RemoveContainer" containerID="616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa"
Apr 21 04:21:48.323148 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.323127 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"]
Apr 21 04:21:48.326897 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.326876 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-hn6g5"]
Apr 21 04:21:48.328178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.328164 2575 scope.go:117] "RemoveContainer" containerID="e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712"
Apr 21 04:21:48.328401 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:21:48.328383 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712\": container with ID starting with e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712 not found: ID does not exist" containerID="e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712"
Apr 21 04:21:48.328463 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.328407 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712"} err="failed to get container status \"e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712\": rpc error: code = NotFound desc = could not find container \"e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712\": container with ID starting with e9f1a0220287502eff87b492475f1af5e9b92b82de4b7bcec271fc744b761712 not found: ID does not exist"
Apr 21 04:21:48.328463 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.328422 2575 scope.go:117] "RemoveContainer" containerID="45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b"
Apr 21 04:21:48.328652 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:21:48.328634 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b\": container with ID starting with 45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b not found: ID does not exist" containerID="45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b"
Apr 21 04:21:48.328701 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.328659 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b"} err="failed to get container status \"45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b\": rpc error: code = NotFound desc = could not find container \"45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b\": container with ID starting with 45a80252969ba5452fea095360d10889aa206af87e2a41f23df1bf2da5ab1d6b not found: ID does not exist"
Apr 21 04:21:48.328701 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.328680 2575 scope.go:117] "RemoveContainer" containerID="616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa"
Apr 21 04:21:48.328999 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:21:48.328979 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa\": container with ID starting with 616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa not found: ID does not exist" containerID="616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa"
Apr 21 04:21:48.329084 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:48.329003 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa"} err="failed to get container status \"616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa\": rpc error: code = NotFound desc = could not find container \"616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa\": container with ID starting with 616e28ca4c8d7cbd2310029e21041c0ad0c85cabd0bb53d1d6dd466c26ca4efa not found: ID does not exist"
Apr 21 04:21:49.028082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:49.028049 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" path="/var/lib/kubelet/pods/47536376-5ab9-4bbe-9f8b-6bae72ec4106/volumes"
Apr 21 04:21:50.310532 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:50.310494 2575 generic.go:358] "Generic (PLEG): container finished" podID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerID="32b87ad3b978843fabea65eb5479627d73ca3f5c55cea874459bb67baef4be2c" exitCode=0
Apr 21 04:21:50.310532 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:50.310525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" event={"ID":"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d","Type":"ContainerDied","Data":"32b87ad3b978843fabea65eb5479627d73ca3f5c55cea874459bb67baef4be2c"}
Apr 21 04:21:57.333700 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:57.333676 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" event={"ID":"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d","Type":"ContainerStarted","Data":"8a5926db9b75a9b7416c1b452c7ca2e9e9f0877ef8fd7e14e76b2ba93874c556"}
Apr 21 04:21:58.338115 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:58.338078 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" event={"ID":"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d","Type":"ContainerStarted","Data":"3a949d3abd772010f324c2ea5f25ea559f8e73382e8841e6e68d3185ee7cb2f6"}
Apr 21 04:21:58.338476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:58.338251 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:21:58.355605 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:58.355557 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podStartSLOduration=6.428016912 podStartE2EDuration="13.355545832s" podCreationTimestamp="2026-04-21 04:21:45 +0000 UTC" firstStartedPulling="2026-04-21 04:21:50.31181542 +0000 UTC m=+1495.976405484" lastFinishedPulling="2026-04-21 04:21:57.239344341 +0000 UTC m=+1502.903934404" observedRunningTime="2026-04-21 04:21:58.354597155 +0000 UTC m=+1504.019187240" watchObservedRunningTime="2026-04-21 04:21:58.355545832 +0000 UTC m=+1504.020135918"
Apr 21 04:21:59.341454 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:59.341414 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:21:59.342344 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:21:59.342313 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:22:00.344159 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:22:00.344117 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:22:05.349509 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:22:05.349480 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:22:05.350005 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:22:05.349980 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:22:15.349968 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:22:15.349880 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:22:25.350445 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:22:25.350389 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:22:35.350424 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:22:35.350386 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:22:45.350743 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:22:45.350699 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:22:55.350128 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:22:55.350089 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:23:05.350785 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:05.350731 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 21 04:23:15.350866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:15.350830 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:23:26.206487 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.206455 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"]
Apr 21 04:23:26.206999 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.206786 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" containerID="cri-o://8a5926db9b75a9b7416c1b452c7ca2e9e9f0877ef8fd7e14e76b2ba93874c556" gracePeriod=30
Apr 21 04:23:26.206999 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.206831 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kube-rbac-proxy" containerID="cri-o://3a949d3abd772010f324c2ea5f25ea559f8e73382e8841e6e68d3185ee7cb2f6" gracePeriod=30
Apr 21 04:23:26.301981 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.301951 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"]
Apr 21 04:23:26.302226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.302215 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="storage-initializer"
Apr 21 04:23:26.302275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.302228 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="storage-initializer"
Apr 21 04:23:26.302275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.302245 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container"
Apr 21 04:23:26.302275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.302251 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container"
Apr 21 04:23:26.302275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.302258 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kube-rbac-proxy"
Apr 21 04:23:26.302275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.302264 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kube-rbac-proxy"
Apr 21 04:23:26.302427 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.302336 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kube-rbac-proxy"
Apr 21 04:23:26.302427 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.302358 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="47536376-5ab9-4bbe-9f8b-6bae72ec4106" containerName="kserve-container"
Apr 21 04:23:26.305463 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.305442 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.307772 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.307737 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\""
Apr 21 04:23:26.308016 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.307993 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\""
Apr 21 04:23:26.314193 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.314172 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"]
Apr 21 04:23:26.396634 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.396607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.396739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.396637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6r9n\" (UniqueName: \"kubernetes.io/projected/1b71838c-762e-40e2-b007-9bcf6195056d-kube-api-access-m6r9n\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.396739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.396677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b71838c-762e-40e2-b007-9bcf6195056d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.396841 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.396798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b71838c-762e-40e2-b007-9bcf6195056d-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.497257 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.497198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b71838c-762e-40e2-b007-9bcf6195056d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.497257 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.497249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b71838c-762e-40e2-b007-9bcf6195056d-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.497405 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.497362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.497461 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.497402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6r9n\" (UniqueName: \"kubernetes.io/projected/1b71838c-762e-40e2-b007-9bcf6195056d-kube-api-access-m6r9n\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.497556 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:23:26.497535 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-runtime-predictor-serving-cert: secret "isvc-pmml-runtime-predictor-serving-cert" not found
Apr 21 04:23:26.497809 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.497559 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b71838c-762e-40e2-b007-9bcf6195056d-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.497809 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:23:26.497623 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls podName:1b71838c-762e-40e2-b007-9bcf6195056d nodeName:}" failed. No retries permitted until 2026-04-21 04:23:26.997599351 +0000 UTC m=+1592.662189434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls") pod "isvc-pmml-runtime-predictor-67bc544947-n7mg2" (UID: "1b71838c-762e-40e2-b007-9bcf6195056d") : secret "isvc-pmml-runtime-predictor-serving-cert" not found
Apr 21 04:23:26.497966 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.497867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b71838c-762e-40e2-b007-9bcf6195056d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.506578 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.506556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6r9n\" (UniqueName: \"kubernetes.io/projected/1b71838c-762e-40e2-b007-9bcf6195056d-kube-api-access-m6r9n\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:26.575358 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.575334 2575 generic.go:358] "Generic (PLEG): container finished" podID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerID="3a949d3abd772010f324c2ea5f25ea559f8e73382e8841e6e68d3185ee7cb2f6" exitCode=2
Apr 21 04:23:26.575477 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:26.575414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" event={"ID":"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d","Type":"ContainerDied","Data":"3a949d3abd772010f324c2ea5f25ea559f8e73382e8841e6e68d3185ee7cb2f6"}
Apr 21 04:23:27.000632 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:27.000591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:27.003029 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:27.003007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-n7mg2\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:27.216402 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:27.216373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"
Apr 21 04:23:27.331607 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:27.331582 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"]
Apr 21 04:23:27.334419 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:23:27.334394 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b71838c_762e_40e2_b007_9bcf6195056d.slice/crio-21ce0e8b6082e1f6e412ae443e777b17e21af12ca5b61ae03d83b435205e5ba0 WatchSource:0}: Error finding container 21ce0e8b6082e1f6e412ae443e777b17e21af12ca5b61ae03d83b435205e5ba0: Status 404 returned error can't find the container with id 21ce0e8b6082e1f6e412ae443e777b17e21af12ca5b61ae03d83b435205e5ba0
Apr 21 04:23:27.579907 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:27.579815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" event={"ID":"1b71838c-762e-40e2-b007-9bcf6195056d","Type":"ContainerStarted","Data":"b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc"}
Apr 21 04:23:27.579907 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:27.579852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" event={"ID":"1b71838c-762e-40e2-b007-9bcf6195056d","Type":"ContainerStarted","Data":"21ce0e8b6082e1f6e412ae443e777b17e21af12ca5b61ae03d83b435205e5ba0"}
Apr 21 04:23:29.588077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.588047 2575 generic.go:358] "Generic (PLEG): container finished" podID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerID="8a5926db9b75a9b7416c1b452c7ca2e9e9f0877ef8fd7e14e76b2ba93874c556" exitCode=0
Apr 21 04:23:29.588460 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.588125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" event={"ID":"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d","Type":"ContainerDied","Data":"8a5926db9b75a9b7416c1b452c7ca2e9e9f0877ef8fd7e14e76b2ba93874c556"}
Apr 21 04:23:29.648995 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.648971 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"
Apr 21 04:23:29.824993 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.824929 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwfxc\" (UniqueName: \"kubernetes.io/projected/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kube-api-access-vwfxc\") pod \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") "
Apr 21 04:23:29.825116 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.824994 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") "
Apr 21 04:23:29.825181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.825161 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kserve-provision-location\") pod \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") "
Apr 21 04:23:29.825223 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.825202 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls\") pod \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\" (UID: \"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d\") "
Apr 21 04:23:29.825267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.825248 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" (UID: "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:23:29.825426 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.825406 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:23:29.825506 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.825431 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" (UID: "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:23:29.826962 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.826936 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kube-api-access-vwfxc" (OuterVolumeSpecName: "kube-api-access-vwfxc") pod "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" (UID: "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d"). InnerVolumeSpecName "kube-api-access-vwfxc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:23:29.827194 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.827177 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" (UID: "84d324b3-1eb6-4f44-b7c4-7037aaa6d75d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:23:29.926236 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.926211 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:23:29.926236 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.926233 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:23:29.926372 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:29.926243 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwfxc\" (UniqueName: \"kubernetes.io/projected/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d-kube-api-access-vwfxc\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:23:30.592734 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:30.592709 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" Apr 21 04:23:30.593213 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:30.592704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq" event={"ID":"84d324b3-1eb6-4f44-b7c4-7037aaa6d75d","Type":"ContainerDied","Data":"c552da284f9fb02933ad95e1dd82416d34cbc112025fc5b41b753a547ba0ea5f"} Apr 21 04:23:30.593213 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:30.592799 2575 scope.go:117] "RemoveContainer" containerID="3a949d3abd772010f324c2ea5f25ea559f8e73382e8841e6e68d3185ee7cb2f6" Apr 21 04:23:30.600724 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:30.600707 2575 scope.go:117] "RemoveContainer" containerID="8a5926db9b75a9b7416c1b452c7ca2e9e9f0877ef8fd7e14e76b2ba93874c556" Apr 21 04:23:30.607322 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:30.607302 2575 scope.go:117] "RemoveContainer" containerID="32b87ad3b978843fabea65eb5479627d73ca3f5c55cea874459bb67baef4be2c" Apr 21 04:23:30.613066 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:30.613044 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"] Apr 21 04:23:30.616427 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:30.616405 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-kk7lq"] Apr 21 04:23:31.027870 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:31.027837 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" path="/var/lib/kubelet/pods/84d324b3-1eb6-4f44-b7c4-7037aaa6d75d/volumes" Apr 21 04:23:31.596298 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:31.596268 2575 generic.go:358] "Generic (PLEG): container finished" podID="1b71838c-762e-40e2-b007-9bcf6195056d" containerID="b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc" exitCode=0 Apr 21 04:23:31.596703 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:23:31.596345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" event={"ID":"1b71838c-762e-40e2-b007-9bcf6195056d","Type":"ContainerDied","Data":"b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc"} Apr 21 04:23:32.601843 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:32.601806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" event={"ID":"1b71838c-762e-40e2-b007-9bcf6195056d","Type":"ContainerStarted","Data":"b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2"} Apr 21 04:23:32.602221 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:32.601854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" event={"ID":"1b71838c-762e-40e2-b007-9bcf6195056d","Type":"ContainerStarted","Data":"4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c"} Apr 21 04:23:32.602221 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:32.602149 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" Apr 21 04:23:32.602344 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:32.602277 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" Apr 21 04:23:32.603474 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:32.603448 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:23:32.619376 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:32.619336 2575 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podStartSLOduration=6.619324822 podStartE2EDuration="6.619324822s" podCreationTimestamp="2026-04-21 04:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:23:32.61801733 +0000 UTC m=+1598.282607416" watchObservedRunningTime="2026-04-21 04:23:32.619324822 +0000 UTC m=+1598.283914908" Apr 21 04:23:33.604923 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:33.604885 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:23:38.608886 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:38.608855 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" Apr 21 04:23:38.609280 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:38.609255 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:23:48.610010 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:48.609929 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:23:58.609705 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:23:58.609666 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:24:08.609752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:08.609664 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:24:18.609960 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:18.609918 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:24:28.609661 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:28.609618 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:24:38.609711 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:38.609671 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:24:48.610704 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:48.610678 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" Apr 21 04:24:57.387565 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:24:57.387535 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"] Apr 21 04:24:57.388039 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.387886 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" containerID="cri-o://4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c" gracePeriod=30 Apr 21 04:24:57.388039 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.387928 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kube-rbac-proxy" containerID="cri-o://b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2" gracePeriod=30 Apr 21 04:24:57.497520 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497496 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t"] Apr 21 04:24:57.497783 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497770 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kube-rbac-proxy" Apr 21 04:24:57.497833 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497786 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kube-rbac-proxy" Apr 21 04:24:57.497833 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497805 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" Apr 21 04:24:57.497833 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497812 2575 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" Apr 21 04:24:57.497833 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497820 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="storage-initializer" Apr 21 04:24:57.497833 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497826 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="storage-initializer" Apr 21 04:24:57.498000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497874 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kserve-container" Apr 21 04:24:57.498000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.497889 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="84d324b3-1eb6-4f44-b7c4-7037aaa6d75d" containerName="kube-rbac-proxy" Apr 21 04:24:57.500910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.500894 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.503703 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.503677 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 21 04:24:57.503703 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.503688 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 21 04:24:57.510787 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.510752 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t"] Apr 21 04:24:57.615690 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.615663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1704c63c-f09f-4a54-8500-b521de42d33a-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.615858 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.615743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1704c63c-f09f-4a54-8500-b521de42d33a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.615858 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.615800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1704c63c-f09f-4a54-8500-b521de42d33a-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.615858 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.615841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj2rj\" (UniqueName: \"kubernetes.io/projected/1704c63c-f09f-4a54-8500-b521de42d33a-kube-api-access-nj2rj\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.716687 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.716619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1704c63c-f09f-4a54-8500-b521de42d33a-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.716687 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.716660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1704c63c-f09f-4a54-8500-b521de42d33a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.716687 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.716680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1704c63c-f09f-4a54-8500-b521de42d33a-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.716940 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.716707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj2rj\" (UniqueName: \"kubernetes.io/projected/1704c63c-f09f-4a54-8500-b521de42d33a-kube-api-access-nj2rj\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.717118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.717094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1704c63c-f09f-4a54-8500-b521de42d33a-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.717361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.717342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1704c63c-f09f-4a54-8500-b521de42d33a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.719040 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.719018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1704c63c-f09f-4a54-8500-b521de42d33a-proxy-tls\") pod 
\"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.724103 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.724082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj2rj\" (UniqueName: \"kubernetes.io/projected/1704c63c-f09f-4a54-8500-b521de42d33a-kube-api-access-nj2rj\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.810583 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.810558 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:24:57.827527 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.827505 2575 generic.go:358] "Generic (PLEG): container finished" podID="1b71838c-762e-40e2-b007-9bcf6195056d" containerID="b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2" exitCode=2 Apr 21 04:24:57.827620 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.827574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" event={"ID":"1b71838c-762e-40e2-b007-9bcf6195056d","Type":"ContainerDied","Data":"b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2"} Apr 21 04:24:57.925381 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:57.925359 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t"] Apr 21 04:24:57.927478 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:24:57.927452 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1704c63c_f09f_4a54_8500_b521de42d33a.slice/crio-3a20d19011c455d2221c5abb758a1c9982e16bd231f219c168cb1e141210d926 WatchSource:0}: Error finding container 3a20d19011c455d2221c5abb758a1c9982e16bd231f219c168cb1e141210d926: Status 404 returned error can't find the container with id 3a20d19011c455d2221c5abb758a1c9982e16bd231f219c168cb1e141210d926 Apr 21 04:24:58.605431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:58.605391 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.35:8643/healthz\": dial tcp 10.132.0.35:8643: connect: connection refused" Apr 21 04:24:58.609702 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:58.609670 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 21 04:24:58.832570 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:58.832536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" event={"ID":"1704c63c-f09f-4a54-8500-b521de42d33a","Type":"ContainerStarted","Data":"b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7"} Apr 21 04:24:58.832775 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:24:58.832578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" event={"ID":"1704c63c-f09f-4a54-8500-b521de42d33a","Type":"ContainerStarted","Data":"3a20d19011c455d2221c5abb758a1c9982e16bd231f219c168cb1e141210d926"} Apr 21 04:25:00.518517 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.518495 
2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" Apr 21 04:25:00.640846 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.640825 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6r9n\" (UniqueName: \"kubernetes.io/projected/1b71838c-762e-40e2-b007-9bcf6195056d-kube-api-access-m6r9n\") pod \"1b71838c-762e-40e2-b007-9bcf6195056d\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " Apr 21 04:25:00.640957 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.640856 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b71838c-762e-40e2-b007-9bcf6195056d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"1b71838c-762e-40e2-b007-9bcf6195056d\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " Apr 21 04:25:00.640957 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.640889 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b71838c-762e-40e2-b007-9bcf6195056d-kserve-provision-location\") pod \"1b71838c-762e-40e2-b007-9bcf6195056d\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " Apr 21 04:25:00.640957 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.640915 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls\") pod \"1b71838c-762e-40e2-b007-9bcf6195056d\" (UID: \"1b71838c-762e-40e2-b007-9bcf6195056d\") " Apr 21 04:25:00.641167 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.641141 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b71838c-762e-40e2-b007-9bcf6195056d-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "1b71838c-762e-40e2-b007-9bcf6195056d" (UID: "1b71838c-762e-40e2-b007-9bcf6195056d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:25:00.641270 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.641244 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b71838c-762e-40e2-b007-9bcf6195056d-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "1b71838c-762e-40e2-b007-9bcf6195056d" (UID: "1b71838c-762e-40e2-b007-9bcf6195056d"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:25:00.642881 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.642858 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b71838c-762e-40e2-b007-9bcf6195056d-kube-api-access-m6r9n" (OuterVolumeSpecName: "kube-api-access-m6r9n") pod "1b71838c-762e-40e2-b007-9bcf6195056d" (UID: "1b71838c-762e-40e2-b007-9bcf6195056d"). InnerVolumeSpecName "kube-api-access-m6r9n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:25:00.643022 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.643002 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1b71838c-762e-40e2-b007-9bcf6195056d" (UID: "1b71838c-762e-40e2-b007-9bcf6195056d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:25:00.741456 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.741422 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m6r9n\" (UniqueName: \"kubernetes.io/projected/1b71838c-762e-40e2-b007-9bcf6195056d-kube-api-access-m6r9n\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:25:00.741456 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.741453 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b71838c-762e-40e2-b007-9bcf6195056d-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:25:00.741456 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.741463 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b71838c-762e-40e2-b007-9bcf6195056d-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:25:00.741625 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.741473 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b71838c-762e-40e2-b007-9bcf6195056d-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:25:00.840697 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.840670 2575 generic.go:358] "Generic (PLEG): container finished" podID="1b71838c-762e-40e2-b007-9bcf6195056d" containerID="4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c" exitCode=0 Apr 21 04:25:00.840837 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.840746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" 
event={"ID":"1b71838c-762e-40e2-b007-9bcf6195056d","Type":"ContainerDied","Data":"4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c"} Apr 21 04:25:00.840837 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.840787 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" Apr 21 04:25:00.840837 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.840795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2" event={"ID":"1b71838c-762e-40e2-b007-9bcf6195056d","Type":"ContainerDied","Data":"21ce0e8b6082e1f6e412ae443e777b17e21af12ca5b61ae03d83b435205e5ba0"} Apr 21 04:25:00.840837 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.840815 2575 scope.go:117] "RemoveContainer" containerID="b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2" Apr 21 04:25:00.848596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.848454 2575 scope.go:117] "RemoveContainer" containerID="4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c" Apr 21 04:25:00.855259 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.855243 2575 scope.go:117] "RemoveContainer" containerID="b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc" Apr 21 04:25:00.861656 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.861633 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"] Apr 21 04:25:00.862169 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.862156 2575 scope.go:117] "RemoveContainer" containerID="b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2" Apr 21 04:25:00.862425 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:25:00.862398 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2\": container with ID starting with b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2 not found: ID does not exist" containerID="b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2" Apr 21 04:25:00.862485 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.862423 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2"} err="failed to get container status \"b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2\": rpc error: code = NotFound desc = could not find container \"b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2\": container with ID starting with b73e4381c590ddce2a62232ce72a1a8bf6084f616c651d1319cfbdbb0f3881a2 not found: ID does not exist" Apr 21 04:25:00.862485 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.862440 2575 scope.go:117] "RemoveContainer" containerID="4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c" Apr 21 04:25:00.862686 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:25:00.862669 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c\": container with ID starting with 4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c not found: ID does not exist" containerID="4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c" Apr 21 04:25:00.862740 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.862692 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c"} err="failed to get container status \"4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c\": rpc error: code = NotFound desc = could not find container 
\"4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c\": container with ID starting with 4a46e8941e664edd8c59cd93f43f29e41308c613b14271e7e195844f0ae41e5c not found: ID does not exist" Apr 21 04:25:00.862740 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.862710 2575 scope.go:117] "RemoveContainer" containerID="b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc" Apr 21 04:25:00.863026 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:25:00.863006 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc\": container with ID starting with b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc not found: ID does not exist" containerID="b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc" Apr 21 04:25:00.863082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.863050 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc"} err="failed to get container status \"b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc\": rpc error: code = NotFound desc = could not find container \"b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc\": container with ID starting with b5e9e6c85945382a342737c23cfd05c21ee0bd3ed3488bf8e8c22e105cf0c6bc not found: ID does not exist" Apr 21 04:25:00.864972 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:00.864952 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-n7mg2"] Apr 21 04:25:01.027290 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:01.027216 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" path="/var/lib/kubelet/pods/1b71838c-762e-40e2-b007-9bcf6195056d/volumes" Apr 21 04:25:01.844966 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:01.844937 2575 generic.go:358] "Generic (PLEG): container finished" podID="1704c63c-f09f-4a54-8500-b521de42d33a" containerID="b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7" exitCode=0 Apr 21 04:25:01.845350 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:01.845020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" event={"ID":"1704c63c-f09f-4a54-8500-b521de42d33a","Type":"ContainerDied","Data":"b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7"} Apr 21 04:25:02.850834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:02.850799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" event={"ID":"1704c63c-f09f-4a54-8500-b521de42d33a","Type":"ContainerStarted","Data":"571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d"} Apr 21 04:25:02.850834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:02.850839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" event={"ID":"1704c63c-f09f-4a54-8500-b521de42d33a","Type":"ContainerStarted","Data":"36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c"} Apr 21 04:25:02.851355 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:02.851064 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:25:02.869343 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:02.869296 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podStartSLOduration=5.869280485 podStartE2EDuration="5.869280485s" podCreationTimestamp="2026-04-21 04:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-21 04:25:02.867161789 +0000 UTC m=+1688.531751875" watchObservedRunningTime="2026-04-21 04:25:02.869280485 +0000 UTC m=+1688.533870572" Apr 21 04:25:03.854214 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:03.854180 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:25:03.855431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:03.855394 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:25:04.857001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:04.856964 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:25:09.861497 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:09.861471 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:25:09.861890 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:09.861852 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:25:19.862451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:19.862366 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" 
podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:25:29.862183 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:29.862146 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:25:39.862396 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:39.862353 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:25:49.862729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:49.862690 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:25:59.861916 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:25:59.861869 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:26:09.861972 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:09.861929 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.36:8080: connect: connection refused" Apr 21 04:26:19.862549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:19.862519 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:26:28.613535 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.613491 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t"] Apr 21 04:26:28.614091 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.613878 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" containerID="cri-o://36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c" gracePeriod=30 Apr 21 04:26:28.614091 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.613942 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kube-rbac-proxy" containerID="cri-o://571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d" gracePeriod=30 Apr 21 04:26:28.709119 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.709077 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"] Apr 21 04:26:28.709392 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.709380 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kube-rbac-proxy" Apr 21 04:26:28.709451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.709394 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kube-rbac-proxy" Apr 21 04:26:28.709451 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:26:28.709405 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="storage-initializer" Apr 21 04:26:28.709451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.709411 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="storage-initializer" Apr 21 04:26:28.709451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.709426 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" Apr 21 04:26:28.709451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.709432 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" Apr 21 04:26:28.709619 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.709482 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kube-rbac-proxy" Apr 21 04:26:28.709619 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.709490 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b71838c-762e-40e2-b007-9bcf6195056d" containerName="kserve-container" Apr 21 04:26:28.712415 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.712398 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.715044 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.715024 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-fe3cee-predictor-serving-cert\"" Apr 21 04:26:28.715141 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.715026 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-fe3cee-kube-rbac-proxy-sar-config\"" Apr 21 04:26:28.724135 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.724114 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"] Apr 21 04:26:28.761366 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.761333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-proxy-tls\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.761523 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.761394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5b99\" (UniqueName: \"kubernetes.io/projected/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kube-api-access-m5b99\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.761523 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.761460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kserve-provision-location\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.761612 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.761523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-isvc-primary-fe3cee-kube-rbac-proxy-sar-config\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.862748 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.862716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5b99\" (UniqueName: \"kubernetes.io/projected/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kube-api-access-m5b99\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.862931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.862776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kserve-provision-location\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.862931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.862813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-isvc-primary-fe3cee-kube-rbac-proxy-sar-config\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.862931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.862835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-proxy-tls\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.863249 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.863224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kserve-provision-location\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.863476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.863457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-isvc-primary-fe3cee-kube-rbac-proxy-sar-config\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.865454 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.865405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-proxy-tls\") pod 
\"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:28.872296 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:28.872273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5b99\" (UniqueName: \"kubernetes.io/projected/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kube-api-access-m5b99\") pod \"isvc-primary-fe3cee-predictor-68445845fd-796xc\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:29.022390 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:29.022352 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:26:29.087840 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:29.087792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" event={"ID":"1704c63c-f09f-4a54-8500-b521de42d33a","Type":"ContainerDied","Data":"571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d"} Apr 21 04:26:29.087840 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:29.087800 2575 generic.go:358] "Generic (PLEG): container finished" podID="1704c63c-f09f-4a54-8500-b521de42d33a" containerID="571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d" exitCode=2 Apr 21 04:26:29.145668 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:29.145635 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"] Apr 21 04:26:29.148688 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:26:29.148661 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeac5670_9986_4fe1_8ee5_a4bf924fbb2e.slice/crio-4a772d708ae470d4b607b57d1a45c07c5b63a43a63079c109c6102558f0e062a WatchSource:0}: Error finding container 4a772d708ae470d4b607b57d1a45c07c5b63a43a63079c109c6102558f0e062a: Status 404 returned error can't find the container with id 4a772d708ae470d4b607b57d1a45c07c5b63a43a63079c109c6102558f0e062a Apr 21 04:26:29.150400 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:29.150385 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:26:29.858217 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:29.858177 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused" Apr 21 04:26:29.862563 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:29.862534 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 21 04:26:30.092025 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:30.091988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" event={"ID":"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e","Type":"ContainerStarted","Data":"b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b"} Apr 21 04:26:30.092025 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:30.092027 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" 
event={"ID":"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e","Type":"ContainerStarted","Data":"4a772d708ae470d4b607b57d1a45c07c5b63a43a63079c109c6102558f0e062a"} Apr 21 04:26:32.256374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.256353 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" Apr 21 04:26:32.390097 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.390071 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1704c63c-f09f-4a54-8500-b521de42d33a-proxy-tls\") pod \"1704c63c-f09f-4a54-8500-b521de42d33a\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " Apr 21 04:26:32.390280 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.390132 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1704c63c-f09f-4a54-8500-b521de42d33a-kserve-provision-location\") pod \"1704c63c-f09f-4a54-8500-b521de42d33a\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " Apr 21 04:26:32.390280 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.390162 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1704c63c-f09f-4a54-8500-b521de42d33a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"1704c63c-f09f-4a54-8500-b521de42d33a\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " Apr 21 04:26:32.390280 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.390198 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj2rj\" (UniqueName: \"kubernetes.io/projected/1704c63c-f09f-4a54-8500-b521de42d33a-kube-api-access-nj2rj\") pod \"1704c63c-f09f-4a54-8500-b521de42d33a\" (UID: \"1704c63c-f09f-4a54-8500-b521de42d33a\") " Apr 21 04:26:32.390496 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:26:32.390469 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1704c63c-f09f-4a54-8500-b521de42d33a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1704c63c-f09f-4a54-8500-b521de42d33a" (UID: "1704c63c-f09f-4a54-8500-b521de42d33a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:26:32.390553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.390512 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1704c63c-f09f-4a54-8500-b521de42d33a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "1704c63c-f09f-4a54-8500-b521de42d33a" (UID: "1704c63c-f09f-4a54-8500-b521de42d33a"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:26:32.392225 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.392195 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1704c63c-f09f-4a54-8500-b521de42d33a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1704c63c-f09f-4a54-8500-b521de42d33a" (UID: "1704c63c-f09f-4a54-8500-b521de42d33a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:26:32.392329 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.392225 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1704c63c-f09f-4a54-8500-b521de42d33a-kube-api-access-nj2rj" (OuterVolumeSpecName: "kube-api-access-nj2rj") pod "1704c63c-f09f-4a54-8500-b521de42d33a" (UID: "1704c63c-f09f-4a54-8500-b521de42d33a"). InnerVolumeSpecName "kube-api-access-nj2rj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:26:32.491426 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.491386 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1704c63c-f09f-4a54-8500-b521de42d33a-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:26:32.491426 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.491420 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1704c63c-f09f-4a54-8500-b521de42d33a-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:26:32.491426 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.491432 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1704c63c-f09f-4a54-8500-b521de42d33a-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:26:32.491664 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:32.491442 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nj2rj\" (UniqueName: \"kubernetes.io/projected/1704c63c-f09f-4a54-8500-b521de42d33a-kube-api-access-nj2rj\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:26:33.100891 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.100857 2575 generic.go:358] "Generic (PLEG): container finished" podID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerID="b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b" exitCode=0 Apr 21 04:26:33.101086 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.100936 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" 
event={"ID":"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e","Type":"ContainerDied","Data":"b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b"} Apr 21 04:26:33.102698 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.102677 2575 generic.go:358] "Generic (PLEG): container finished" podID="1704c63c-f09f-4a54-8500-b521de42d33a" containerID="36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c" exitCode=0 Apr 21 04:26:33.102821 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.102740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" event={"ID":"1704c63c-f09f-4a54-8500-b521de42d33a","Type":"ContainerDied","Data":"36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c"} Apr 21 04:26:33.102821 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.102780 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t" event={"ID":"1704c63c-f09f-4a54-8500-b521de42d33a","Type":"ContainerDied","Data":"3a20d19011c455d2221c5abb758a1c9982e16bd231f219c168cb1e141210d926"} Apr 21 04:26:33.102821 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.102784 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t"
Apr 21 04:26:33.102821 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.102795 2575 scope.go:117] "RemoveContainer" containerID="571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d"
Apr 21 04:26:33.110443 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.110425 2575 scope.go:117] "RemoveContainer" containerID="36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c"
Apr 21 04:26:33.117181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.117164 2575 scope.go:117] "RemoveContainer" containerID="b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7"
Apr 21 04:26:33.127980 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.127752 2575 scope.go:117] "RemoveContainer" containerID="571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d"
Apr 21 04:26:33.128088 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:26:33.128061 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d\": container with ID starting with 571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d not found: ID does not exist" containerID="571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d"
Apr 21 04:26:33.128131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.128099 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d"} err="failed to get container status \"571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d\": rpc error: code = NotFound desc = could not find container \"571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d\": container with ID starting with 571c175cf9d4b79d2b5234112d5e5fdb3cef52ca10f3fc29384e02653f9a862d not found: ID does not exist"
Apr 21 04:26:33.128131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.128121 2575 scope.go:117] "RemoveContainer" containerID="36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c"
Apr 21 04:26:33.128416 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:26:33.128384 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c\": container with ID starting with 36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c not found: ID does not exist" containerID="36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c"
Apr 21 04:26:33.128465 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.128423 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c"} err="failed to get container status \"36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c\": rpc error: code = NotFound desc = could not find container \"36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c\": container with ID starting with 36cb146fbbeb69cebea6dfa2eacbf75447a48c264d6a273a6df23597f2578a0c not found: ID does not exist"
Apr 21 04:26:33.128465 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.128441 2575 scope.go:117] "RemoveContainer" containerID="b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7"
Apr 21 04:26:33.128695 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:26:33.128680 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7\": container with ID starting with b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7 not found: ID does not exist" containerID="b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7"
Apr 21 04:26:33.128740 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.128700 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7"} err="failed to get container status \"b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7\": rpc error: code = NotFound desc = could not find container \"b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7\": container with ID starting with b39aafce7f17612a2a91fb13643fa040f86f3d8bf639ce9ffe19e8b93a1d06f7 not found: ID does not exist"
Apr 21 04:26:33.131560 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.131538 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t"]
Apr 21 04:26:33.137391 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:33.137373 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-fqz2t"]
Apr 21 04:26:34.107649 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:34.107619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" event={"ID":"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e","Type":"ContainerStarted","Data":"a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e"}
Apr 21 04:26:34.108345 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:34.107660 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" event={"ID":"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e","Type":"ContainerStarted","Data":"232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d"}
Apr 21 04:26:34.108345 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:34.107960 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"
Apr 21 04:26:34.108345 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:34.108103 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"
Apr 21 04:26:34.109360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:34.109336 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 21 04:26:34.126495 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:34.126443 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podStartSLOduration=6.1264297580000004 podStartE2EDuration="6.126429758s" podCreationTimestamp="2026-04-21 04:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:26:34.124955313 +0000 UTC m=+1779.789545398" watchObservedRunningTime="2026-04-21 04:26:34.126429758 +0000 UTC m=+1779.791019904"
Apr 21 04:26:35.027507 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:35.027476 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" path="/var/lib/kubelet/pods/1704c63c-f09f-4a54-8500-b521de42d33a/volumes"
Apr 21 04:26:35.110883 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:35.110849 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 21 04:26:40.115825 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:40.115797 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"
Apr 21 04:26:40.116361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:40.116338 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 21 04:26:50.116965 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:26:50.116878 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 21 04:27:00.116866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:00.116820 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 21 04:27:10.117077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:10.117032 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 21 04:27:20.117279 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:20.117236 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 21 04:27:30.116912 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:30.116867 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 21 04:27:40.117794 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:40.117741 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"
Apr 21 04:27:48.867157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867118 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"]
Apr 21 04:27:48.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867399 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="storage-initializer"
Apr 21 04:27:48.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867410 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="storage-initializer"
Apr 21 04:27:48.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867423 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kube-rbac-proxy"
Apr 21 04:27:48.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867430 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kube-rbac-proxy"
Apr 21 04:27:48.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867447 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container"
Apr 21 04:27:48.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867481 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container"
Apr 21 04:27:48.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867525 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kube-rbac-proxy"
Apr 21 04:27:48.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.867535 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1704c63c-f09f-4a54-8500-b521de42d33a" containerName="kserve-container"
Apr 21 04:27:48.870548 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.870531 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:48.873637 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.873611 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-fe3cee\""
Apr 21 04:27:48.873813 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.873611 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-fe3cee-dockercfg-n4kj9\""
Apr 21 04:27:48.873813 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.873611 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\""
Apr 21 04:27:48.873813 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.873725 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-fe3cee-predictor-serving-cert\""
Apr 21 04:27:48.874695 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.874658 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 21 04:27:48.882461 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.882437 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"]
Apr 21 04:27:48.932803 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.932749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:48.933006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.932814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/873c62a2-2d37-4b57-829b-a0da752f3afe-kserve-provision-location\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:48.933006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.932837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-cabundle-cert\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:48.933006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.932915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:48.933006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:48.932950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xgnm\" (UniqueName: \"kubernetes.io/projected/873c62a2-2d37-4b57-829b-a0da752f3afe-kube-api-access-5xgnm\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.034129 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.034098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.034306 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.034141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/873c62a2-2d37-4b57-829b-a0da752f3afe-kserve-provision-location\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.034306 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.034175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-cabundle-cert\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.034306 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.034212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.034306 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.034238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xgnm\" (UniqueName: \"kubernetes.io/projected/873c62a2-2d37-4b57-829b-a0da752f3afe-kube-api-access-5xgnm\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.034306 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:27:49.034253 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-serving-cert: secret "isvc-secondary-fe3cee-predictor-serving-cert" not found
Apr 21 04:27:49.034589 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:27:49.034329 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls podName:873c62a2-2d37-4b57-829b-a0da752f3afe nodeName:}" failed. No retries permitted until 2026-04-21 04:27:49.5343071 +0000 UTC m=+1855.198897180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls") pod "isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" (UID: "873c62a2-2d37-4b57-829b-a0da752f3afe") : secret "isvc-secondary-fe3cee-predictor-serving-cert" not found
Apr 21 04:27:49.034659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.034623 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/873c62a2-2d37-4b57-829b-a0da752f3afe-kserve-provision-location\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.034924 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.034905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-cabundle-cert\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.034971 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.034922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.042645 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.042622 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xgnm\" (UniqueName: \"kubernetes.io/projected/873c62a2-2d37-4b57-829b-a0da752f3afe-kube-api-access-5xgnm\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.538171 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.538128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.540630 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.540604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls\") pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") " pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.780332 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.780283 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:27:49.902555 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:49.902494 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"]
Apr 21 04:27:49.905063 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:27:49.905025 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod873c62a2_2d37_4b57_829b_a0da752f3afe.slice/crio-f5171d997a2277441d4e5173acf2ea13d00148d1a3d5ce644a0647ecc33afdb2 WatchSource:0}: Error finding container f5171d997a2277441d4e5173acf2ea13d00148d1a3d5ce644a0647ecc33afdb2: Status 404 returned error can't find the container with id f5171d997a2277441d4e5173acf2ea13d00148d1a3d5ce644a0647ecc33afdb2
Apr 21 04:27:50.320797 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:50.320740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" event={"ID":"873c62a2-2d37-4b57-829b-a0da752f3afe","Type":"ContainerStarted","Data":"59e0bd5d5405aaa8c8ce19433d32e2a669daccb807dad0ec1446fa4f156a767b"}
Apr 21 04:27:50.320797 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:50.320799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" event={"ID":"873c62a2-2d37-4b57-829b-a0da752f3afe","Type":"ContainerStarted","Data":"f5171d997a2277441d4e5173acf2ea13d00148d1a3d5ce644a0647ecc33afdb2"}
Apr 21 04:27:54.333427 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:54.333346 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_873c62a2-2d37-4b57-829b-a0da752f3afe/storage-initializer/0.log"
Apr 21 04:27:54.333427 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:54.333388 2575 generic.go:358] "Generic (PLEG): container finished" podID="873c62a2-2d37-4b57-829b-a0da752f3afe" containerID="59e0bd5d5405aaa8c8ce19433d32e2a669daccb807dad0ec1446fa4f156a767b" exitCode=1
Apr 21 04:27:54.333848 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:54.333470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" event={"ID":"873c62a2-2d37-4b57-829b-a0da752f3afe","Type":"ContainerDied","Data":"59e0bd5d5405aaa8c8ce19433d32e2a669daccb807dad0ec1446fa4f156a767b"}
Apr 21 04:27:55.337666 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:55.337633 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_873c62a2-2d37-4b57-829b-a0da752f3afe/storage-initializer/0.log"
Apr 21 04:27:55.338186 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:55.337688 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" event={"ID":"873c62a2-2d37-4b57-829b-a0da752f3afe","Type":"ContainerStarted","Data":"8fbd68f9be2225c158572f5b0db0ea5688468a16229745a462476c717f5e77f7"}
Apr 21 04:27:59.349447 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:59.349360 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_873c62a2-2d37-4b57-829b-a0da752f3afe/storage-initializer/1.log"
Apr 21 04:27:59.349859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:59.349696 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_873c62a2-2d37-4b57-829b-a0da752f3afe/storage-initializer/0.log"
Apr 21 04:27:59.349859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:59.349726 2575 generic.go:358] "Generic (PLEG): container finished" podID="873c62a2-2d37-4b57-829b-a0da752f3afe" containerID="8fbd68f9be2225c158572f5b0db0ea5688468a16229745a462476c717f5e77f7" exitCode=1
Apr 21 04:27:59.349859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:59.349797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" event={"ID":"873c62a2-2d37-4b57-829b-a0da752f3afe","Type":"ContainerDied","Data":"8fbd68f9be2225c158572f5b0db0ea5688468a16229745a462476c717f5e77f7"}
Apr 21 04:27:59.349859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:59.349848 2575 scope.go:117] "RemoveContainer" containerID="59e0bd5d5405aaa8c8ce19433d32e2a669daccb807dad0ec1446fa4f156a767b"
Apr 21 04:27:59.350232 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:27:59.350212 2575 scope.go:117] "RemoveContainer" containerID="59e0bd5d5405aaa8c8ce19433d32e2a669daccb807dad0ec1446fa4f156a767b"
Apr 21 04:27:59.360295 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:27:59.360271 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_kserve-ci-e2e-test_873c62a2-2d37-4b57-829b-a0da752f3afe_0 in pod sandbox f5171d997a2277441d4e5173acf2ea13d00148d1a3d5ce644a0647ecc33afdb2 from index: no such id: '59e0bd5d5405aaa8c8ce19433d32e2a669daccb807dad0ec1446fa4f156a767b'" containerID="59e0bd5d5405aaa8c8ce19433d32e2a669daccb807dad0ec1446fa4f156a767b"
Apr 21 04:27:59.360358 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:27:59.360317 2575 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_kserve-ci-e2e-test_873c62a2-2d37-4b57-829b-a0da752f3afe_0 in pod sandbox f5171d997a2277441d4e5173acf2ea13d00148d1a3d5ce644a0647ecc33afdb2 from index: no such id: '59e0bd5d5405aaa8c8ce19433d32e2a669daccb807dad0ec1446fa4f156a767b'; Skipping pod \"isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_kserve-ci-e2e-test(873c62a2-2d37-4b57-829b-a0da752f3afe)\"" logger="UnhandledError"
Apr 21 04:27:59.361637 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:27:59.361615 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_kserve-ci-e2e-test(873c62a2-2d37-4b57-829b-a0da752f3afe)\"" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" podUID="873c62a2-2d37-4b57-829b-a0da752f3afe"
Apr 21 04:28:00.354909 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:00.354874 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_873c62a2-2d37-4b57-829b-a0da752f3afe/storage-initializer/1.log"
Apr 21 04:28:04.916125 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:04.916090 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"]
Apr 21 04:28:04.972867 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:04.972835 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"]
Apr 21 04:28:04.973338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:04.973302 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" containerID="cri-o://232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d" gracePeriod=30
Apr 21 04:28:04.973480 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:04.973378 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kube-rbac-proxy" containerID="cri-o://a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e" gracePeriod=30
Apr 21 04:28:05.065170 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.065134 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"]
Apr 21 04:28:05.069813 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.069793 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"
Apr 21 04:28:05.072448 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.072420 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-8ab562\""
Apr 21 04:28:05.072583 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.072452 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\""
Apr 21 04:28:05.072583 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.072511 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-8ab562-predictor-serving-cert\""
Apr 21 04:28:05.072583 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.072512 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-8ab562-dockercfg-j88db\""
Apr 21 04:28:05.079644 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.079622 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"]
Apr 21 04:28:05.096433 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.096404 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_873c62a2-2d37-4b57-829b-a0da752f3afe/storage-initializer/1.log"
Apr 21 04:28:05.096552 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.096467 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"
Apr 21 04:28:05.111714 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.111678 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused"
Apr 21 04:28:05.164968 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.164927 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls\") pod \"873c62a2-2d37-4b57-829b-a0da752f3afe\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") "
Apr 21 04:28:05.165168 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.164997 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-cabundle-cert\") pod \"873c62a2-2d37-4b57-829b-a0da752f3afe\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") "
Apr 21 04:28:05.165168 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165023 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\") pod \"873c62a2-2d37-4b57-829b-a0da752f3afe\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") "
Apr 21 04:28:05.165168 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165072 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xgnm\" (UniqueName: \"kubernetes.io/projected/873c62a2-2d37-4b57-829b-a0da752f3afe-kube-api-access-5xgnm\") pod \"873c62a2-2d37-4b57-829b-a0da752f3afe\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") "
Apr 21 04:28:05.165168 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165104 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/873c62a2-2d37-4b57-829b-a0da752f3afe-kserve-provision-location\") pod \"873c62a2-2d37-4b57-829b-a0da752f3afe\" (UID: \"873c62a2-2d37-4b57-829b-a0da752f3afe\") "
Apr 21 04:28:05.165406 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6g6\" (UniqueName: \"kubernetes.io/projected/718b0425-c740-48fb-a8a5-16f62d7b64ba-kube-api-access-mj6g6\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"
Apr 21 04:28:05.165406 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"
Apr 21 04:28:05.165406 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"
Apr 21 04:28:05.165406 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/718b0425-c740-48fb-a8a5-16f62d7b64ba-kserve-provision-location\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"
Apr 21 04:28:05.165406 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-cabundle-cert\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"
Apr 21 04:28:05.165644 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165417 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-isvc-secondary-fe3cee-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-fe3cee-kube-rbac-proxy-sar-config") pod "873c62a2-2d37-4b57-829b-a0da752f3afe" (UID: "873c62a2-2d37-4b57-829b-a0da752f3afe"). InnerVolumeSpecName "isvc-secondary-fe3cee-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:28:05.165644 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165438 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "873c62a2-2d37-4b57-829b-a0da752f3afe" (UID: "873c62a2-2d37-4b57-829b-a0da752f3afe"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:28:05.165644 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.165459 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873c62a2-2d37-4b57-829b-a0da752f3afe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "873c62a2-2d37-4b57-829b-a0da752f3afe" (UID: "873c62a2-2d37-4b57-829b-a0da752f3afe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:28:05.167314 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.167250 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873c62a2-2d37-4b57-829b-a0da752f3afe-kube-api-access-5xgnm" (OuterVolumeSpecName: "kube-api-access-5xgnm") pod "873c62a2-2d37-4b57-829b-a0da752f3afe" (UID: "873c62a2-2d37-4b57-829b-a0da752f3afe"). InnerVolumeSpecName "kube-api-access-5xgnm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:28:05.167412 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.167315 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "873c62a2-2d37-4b57-829b-a0da752f3afe" (UID: "873c62a2-2d37-4b57-829b-a0da752f3afe"). InnerVolumeSpecName "proxy-tls".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:05.266149 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6g6\" (UniqueName: \"kubernetes.io/projected/718b0425-c740-48fb-a8a5-16f62d7b64ba-kube-api-access-mj6g6\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.266363 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.266363 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.266363 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:28:05.266329 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-serving-cert: secret "isvc-init-fail-8ab562-predictor-serving-cert" not found Apr 21 04:28:05.266528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/718b0425-c740-48fb-a8a5-16f62d7b64ba-kserve-provision-location\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.266528 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:28:05.266405 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls podName:718b0425-c740-48fb-a8a5-16f62d7b64ba nodeName:}" failed. No retries permitted until 2026-04-21 04:28:05.766383135 +0000 UTC m=+1871.430973208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls") pod "isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" (UID: "718b0425-c740-48fb-a8a5-16f62d7b64ba") : secret "isvc-init-fail-8ab562-predictor-serving-cert" not found Apr 21 04:28:05.266528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-cabundle-cert\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.266528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266494 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-cabundle-cert\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:05.266528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266510 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/873c62a2-2d37-4b57-829b-a0da752f3afe-isvc-secondary-fe3cee-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:05.266528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266524 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xgnm\" (UniqueName: \"kubernetes.io/projected/873c62a2-2d37-4b57-829b-a0da752f3afe-kube-api-access-5xgnm\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:05.266749 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266540 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/873c62a2-2d37-4b57-829b-a0da752f3afe-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:05.266749 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266555 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/873c62a2-2d37-4b57-829b-a0da752f3afe-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:05.266749 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266734 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/718b0425-c740-48fb-a8a5-16f62d7b64ba-kserve-provision-location\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.266933 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.266914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: 
\"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.267082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.267065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-cabundle-cert\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.274304 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.274272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6g6\" (UniqueName: \"kubernetes.io/projected/718b0425-c740-48fb-a8a5-16f62d7b64ba-kube-api-access-mj6g6\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.370346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.370311 2575 generic.go:358] "Generic (PLEG): container finished" podID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerID="a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e" exitCode=2 Apr 21 04:28:05.370517 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.370381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" event={"ID":"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e","Type":"ContainerDied","Data":"a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e"} Apr 21 04:28:05.371387 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.371370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk_873c62a2-2d37-4b57-829b-a0da752f3afe/storage-initializer/1.log" Apr 21 04:28:05.371498 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:28:05.371450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" event={"ID":"873c62a2-2d37-4b57-829b-a0da752f3afe","Type":"ContainerDied","Data":"f5171d997a2277441d4e5173acf2ea13d00148d1a3d5ce644a0647ecc33afdb2"} Apr 21 04:28:05.371498 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.371480 2575 scope.go:117] "RemoveContainer" containerID="8fbd68f9be2225c158572f5b0db0ea5688468a16229745a462476c717f5e77f7" Apr 21 04:28:05.371498 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.371491 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk" Apr 21 04:28:05.409159 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.409119 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"] Apr 21 04:28:05.413522 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.413492 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fe3cee-predictor-6f89d8576f-gchhk"] Apr 21 04:28:05.770492 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.770444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.772934 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.772907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls\") pod \"isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:05.993463 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:05.993425 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:06.118074 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:06.118039 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"] Apr 21 04:28:06.121212 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:28:06.121181 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod718b0425_c740_48fb_a8a5_16f62d7b64ba.slice/crio-700ce933e5db7fafa208d5a06b67b3d8305c205f064c37c4ea47972ce7706e31 WatchSource:0}: Error finding container 700ce933e5db7fafa208d5a06b67b3d8305c205f064c37c4ea47972ce7706e31: Status 404 returned error can't find the container with id 700ce933e5db7fafa208d5a06b67b3d8305c205f064c37c4ea47972ce7706e31 Apr 21 04:28:06.376931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:06.376889 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" event={"ID":"718b0425-c740-48fb-a8a5-16f62d7b64ba","Type":"ContainerStarted","Data":"c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88"} Apr 21 04:28:06.376931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:06.376931 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" event={"ID":"718b0425-c740-48fb-a8a5-16f62d7b64ba","Type":"ContainerStarted","Data":"700ce933e5db7fafa208d5a06b67b3d8305c205f064c37c4ea47972ce7706e31"} Apr 21 04:28:07.027448 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:07.027411 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873c62a2-2d37-4b57-829b-a0da752f3afe" 
path="/var/lib/kubelet/pods/873c62a2-2d37-4b57-829b-a0da752f3afe/volumes" Apr 21 04:28:09.619330 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.619304 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:28:09.703171 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.703072 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kserve-provision-location\") pod \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " Apr 21 04:28:09.703171 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.703128 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5b99\" (UniqueName: \"kubernetes.io/projected/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kube-api-access-m5b99\") pod \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " Apr 21 04:28:09.703171 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.703147 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-proxy-tls\") pod \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " Apr 21 04:28:09.703171 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.703167 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-isvc-primary-fe3cee-kube-rbac-proxy-sar-config\") pod \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\" (UID: \"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e\") " Apr 21 04:28:09.703527 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.703473 2575 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" (UID: "eeac5670-9986-4fe1-8ee5-a4bf924fbb2e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:28:09.703570 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.703519 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-isvc-primary-fe3cee-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-fe3cee-kube-rbac-proxy-sar-config") pod "eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" (UID: "eeac5670-9986-4fe1-8ee5-a4bf924fbb2e"). InnerVolumeSpecName "isvc-primary-fe3cee-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:28:09.705304 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.705272 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" (UID: "eeac5670-9986-4fe1-8ee5-a4bf924fbb2e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:09.705422 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.705304 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kube-api-access-m5b99" (OuterVolumeSpecName: "kube-api-access-m5b99") pod "eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" (UID: "eeac5670-9986-4fe1-8ee5-a4bf924fbb2e"). InnerVolumeSpecName "kube-api-access-m5b99". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:28:09.804093 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.804057 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:09.804093 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.804089 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5b99\" (UniqueName: \"kubernetes.io/projected/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-kube-api-access-m5b99\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:09.804303 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.804100 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:09.804303 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:09.804114 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-fe3cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e-isvc-primary-fe3cee-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:10.389560 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.389527 2575 generic.go:358] "Generic (PLEG): container finished" podID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerID="232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d" exitCode=0 Apr 21 04:28:10.389777 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.389607 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" Apr 21 04:28:10.389777 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.389604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" event={"ID":"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e","Type":"ContainerDied","Data":"232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d"} Apr 21 04:28:10.389777 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.389721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc" event={"ID":"eeac5670-9986-4fe1-8ee5-a4bf924fbb2e","Type":"ContainerDied","Data":"4a772d708ae470d4b607b57d1a45c07c5b63a43a63079c109c6102558f0e062a"} Apr 21 04:28:10.389777 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.389743 2575 scope.go:117] "RemoveContainer" containerID="a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e" Apr 21 04:28:10.397667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.397648 2575 scope.go:117] "RemoveContainer" containerID="232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d" Apr 21 04:28:10.404186 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.404171 2575 scope.go:117] "RemoveContainer" containerID="b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b" Apr 21 04:28:10.409982 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.409952 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"] Apr 21 04:28:10.411201 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.411185 2575 scope.go:117] "RemoveContainer" containerID="a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e" Apr 21 04:28:10.411453 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:28:10.411433 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e\": container with ID starting with a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e not found: ID does not exist" containerID="a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e" Apr 21 04:28:10.411506 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.411466 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e"} err="failed to get container status \"a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e\": rpc error: code = NotFound desc = could not find container \"a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e\": container with ID starting with a3620e76d8e7533af46a282f80dd325d9fcd63ea3bf228fe5a61a0d5dc58182e not found: ID does not exist" Apr 21 04:28:10.411506 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.411485 2575 scope.go:117] "RemoveContainer" containerID="232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d" Apr 21 04:28:10.411736 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:28:10.411717 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d\": container with ID starting with 232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d not found: ID does not exist" containerID="232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d" Apr 21 04:28:10.411801 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.411740 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d"} err="failed to get container status \"232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d\": rpc error: code = NotFound desc = could not find container 
\"232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d\": container with ID starting with 232a7104736299d6640d2fea044c48b4e406b2a4971c5e7b0265aeb49a880f7d not found: ID does not exist" Apr 21 04:28:10.411957 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.411945 2575 scope.go:117] "RemoveContainer" containerID="b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b" Apr 21 04:28:10.412174 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:28:10.412157 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b\": container with ID starting with b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b not found: ID does not exist" containerID="b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b" Apr 21 04:28:10.412241 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.412177 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b"} err="failed to get container status \"b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b\": rpc error: code = NotFound desc = could not find container \"b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b\": container with ID starting with b1fe58233828663e9dabead1ced2c7c2742891acf95b062d6cd358f1c190578b not found: ID does not exist" Apr 21 04:28:10.414634 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:10.414614 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fe3cee-predictor-68445845fd-796xc"] Apr 21 04:28:11.027931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:11.027898 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" path="/var/lib/kubelet/pods/eeac5670-9986-4fe1-8ee5-a4bf924fbb2e/volumes" Apr 21 04:28:12.402860 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:12.402829 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs_718b0425-c740-48fb-a8a5-16f62d7b64ba/storage-initializer/0.log" Apr 21 04:28:12.403246 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:12.402871 2575 generic.go:358] "Generic (PLEG): container finished" podID="718b0425-c740-48fb-a8a5-16f62d7b64ba" containerID="c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88" exitCode=1 Apr 21 04:28:12.403246 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:12.402920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" event={"ID":"718b0425-c740-48fb-a8a5-16f62d7b64ba","Type":"ContainerDied","Data":"c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88"} Apr 21 04:28:13.407059 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:13.407030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs_718b0425-c740-48fb-a8a5-16f62d7b64ba/storage-initializer/0.log" Apr 21 04:28:13.407437 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:13.407119 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" event={"ID":"718b0425-c740-48fb-a8a5-16f62d7b64ba","Type":"ContainerStarted","Data":"20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b"} Apr 21 04:28:15.075267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.075185 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"] Apr 21 04:28:15.075640 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.075467 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" 
podUID="718b0425-c740-48fb-a8a5-16f62d7b64ba" containerName="storage-initializer" containerID="cri-o://20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b" gracePeriod=30 Apr 21 04:28:15.251401 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251372 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw"] Apr 21 04:28:15.251748 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251729 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kube-rbac-proxy" Apr 21 04:28:15.251748 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251746 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kube-rbac-proxy" Apr 21 04:28:15.251748 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251772 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="873c62a2-2d37-4b57-829b-a0da752f3afe" containerName="storage-initializer" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251782 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="873c62a2-2d37-4b57-829b-a0da752f3afe" containerName="storage-initializer" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251809 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251817 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251835 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="storage-initializer" Apr 21 
04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251841 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="storage-initializer" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251897 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="873c62a2-2d37-4b57-829b-a0da752f3afe" containerName="storage-initializer" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251908 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kube-rbac-proxy" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251916 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="873c62a2-2d37-4b57-829b-a0da752f3afe" containerName="storage-initializer" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251922 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="eeac5670-9986-4fe1-8ee5-a4bf924fbb2e" containerName="kserve-container" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251968 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="873c62a2-2d37-4b57-829b-a0da752f3afe" containerName="storage-initializer" Apr 21 04:28:15.252006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.251975 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="873c62a2-2d37-4b57-829b-a0da752f3afe" containerName="storage-initializer" Apr 21 04:28:15.255199 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.255176 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.257670 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.257648 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 21 04:28:15.257802 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.257698 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wxnxz\"" Apr 21 04:28:15.257802 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.257740 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 21 04:28:15.263286 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.263004 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw"] Apr 21 04:28:15.348992 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.348897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.348992 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.348941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmg2b\" (UniqueName: \"kubernetes.io/projected/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kube-api-access-fmg2b\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.348992 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:28:15.348981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.349231 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.349066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.449593 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.449561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.449787 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.449711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.449787 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.449776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.449925 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.449806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmg2b\" (UniqueName: \"kubernetes.io/projected/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kube-api-access-fmg2b\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.450156 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.450129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.450382 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.450362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 
04:28:15.452187 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.452167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.457773 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.457734 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmg2b\" (UniqueName: \"kubernetes.io/projected/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kube-api-access-fmg2b\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.567803 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.567740 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:15.693376 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.693332 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw"] Apr 21 04:28:15.697633 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:28:15.697602 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ba3bb26_284c_4ca1_89f4_d7c405f7c28c.slice/crio-0c9bf32660e7eaf995497c093ce66ba5a742ee9a4b1dfb88added5fc3c942b10 WatchSource:0}: Error finding container 0c9bf32660e7eaf995497c093ce66ba5a742ee9a4b1dfb88added5fc3c942b10: Status 404 returned error can't find the container with id 0c9bf32660e7eaf995497c093ce66ba5a742ee9a4b1dfb88added5fc3c942b10 Apr 21 04:28:15.896483 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.896460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs_718b0425-c740-48fb-a8a5-16f62d7b64ba/storage-initializer/1.log" Apr 21 04:28:15.896953 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.896933 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs_718b0425-c740-48fb-a8a5-16f62d7b64ba/storage-initializer/0.log" Apr 21 04:28:15.897037 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.897023 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:15.954990 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.954957 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls\") pod \"718b0425-c740-48fb-a8a5-16f62d7b64ba\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " Apr 21 04:28:15.955157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.955000 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/718b0425-c740-48fb-a8a5-16f62d7b64ba-kserve-provision-location\") pod \"718b0425-c740-48fb-a8a5-16f62d7b64ba\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " Apr 21 04:28:15.955157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.955025 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\") pod \"718b0425-c740-48fb-a8a5-16f62d7b64ba\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " Apr 21 04:28:15.955157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.955081 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj6g6\" (UniqueName: \"kubernetes.io/projected/718b0425-c740-48fb-a8a5-16f62d7b64ba-kube-api-access-mj6g6\") pod \"718b0425-c740-48fb-a8a5-16f62d7b64ba\" (UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " Apr 21 04:28:15.955157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.955128 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-cabundle-cert\") pod \"718b0425-c740-48fb-a8a5-16f62d7b64ba\" 
(UID: \"718b0425-c740-48fb-a8a5-16f62d7b64ba\") " Apr 21 04:28:15.955381 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.955294 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718b0425-c740-48fb-a8a5-16f62d7b64ba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "718b0425-c740-48fb-a8a5-16f62d7b64ba" (UID: "718b0425-c740-48fb-a8a5-16f62d7b64ba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:28:15.955435 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.955411 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/718b0425-c740-48fb-a8a5-16f62d7b64ba-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:15.955540 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.955512 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-isvc-init-fail-8ab562-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-8ab562-kube-rbac-proxy-sar-config") pod "718b0425-c740-48fb-a8a5-16f62d7b64ba" (UID: "718b0425-c740-48fb-a8a5-16f62d7b64ba"). InnerVolumeSpecName "isvc-init-fail-8ab562-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:28:15.955612 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.955595 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "718b0425-c740-48fb-a8a5-16f62d7b64ba" (UID: "718b0425-c740-48fb-a8a5-16f62d7b64ba"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:28:15.957196 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.957172 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "718b0425-c740-48fb-a8a5-16f62d7b64ba" (UID: "718b0425-c740-48fb-a8a5-16f62d7b64ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:15.957291 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:15.957200 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718b0425-c740-48fb-a8a5-16f62d7b64ba-kube-api-access-mj6g6" (OuterVolumeSpecName: "kube-api-access-mj6g6") pod "718b0425-c740-48fb-a8a5-16f62d7b64ba" (UID: "718b0425-c740-48fb-a8a5-16f62d7b64ba"). InnerVolumeSpecName "kube-api-access-mj6g6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:28:16.056300 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.056260 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-cabundle-cert\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:16.056300 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.056300 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/718b0425-c740-48fb-a8a5-16f62d7b64ba-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:16.056482 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.056316 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/718b0425-c740-48fb-a8a5-16f62d7b64ba-isvc-init-fail-8ab562-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:16.056482 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:28:16.056334 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mj6g6\" (UniqueName: \"kubernetes.io/projected/718b0425-c740-48fb-a8a5-16f62d7b64ba-kube-api-access-mj6g6\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:28:16.416065 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.416038 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs_718b0425-c740-48fb-a8a5-16f62d7b64ba/storage-initializer/1.log" Apr 21 04:28:16.416536 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.416414 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs_718b0425-c740-48fb-a8a5-16f62d7b64ba/storage-initializer/0.log" Apr 21 04:28:16.416536 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.416454 2575 generic.go:358] "Generic (PLEG): container finished" podID="718b0425-c740-48fb-a8a5-16f62d7b64ba" containerID="20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b" exitCode=1 Apr 21 04:28:16.416536 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.416533 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" Apr 21 04:28:16.416694 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.416530 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" event={"ID":"718b0425-c740-48fb-a8a5-16f62d7b64ba","Type":"ContainerDied","Data":"20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b"} Apr 21 04:28:16.416694 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.416647 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs" event={"ID":"718b0425-c740-48fb-a8a5-16f62d7b64ba","Type":"ContainerDied","Data":"700ce933e5db7fafa208d5a06b67b3d8305c205f064c37c4ea47972ce7706e31"} Apr 21 04:28:16.416694 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.416673 2575 scope.go:117] "RemoveContainer" containerID="20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b" Apr 21 04:28:16.418166 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.418144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" event={"ID":"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c","Type":"ContainerStarted","Data":"5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d"} Apr 21 04:28:16.418261 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.418171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" event={"ID":"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c","Type":"ContainerStarted","Data":"0c9bf32660e7eaf995497c093ce66ba5a742ee9a4b1dfb88added5fc3c942b10"} Apr 21 04:28:16.425095 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.425077 2575 scope.go:117] "RemoveContainer" containerID="c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88" Apr 21 04:28:16.432519 ip-10-0-134-15 kubenswrapper[2575]: 
I0421 04:28:16.432499 2575 scope.go:117] "RemoveContainer" containerID="20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b" Apr 21 04:28:16.432819 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:28:16.432796 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b\": container with ID starting with 20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b not found: ID does not exist" containerID="20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b" Apr 21 04:28:16.432899 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.432827 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b"} err="failed to get container status \"20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b\": rpc error: code = NotFound desc = could not find container \"20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b\": container with ID starting with 20138c67b50c7b6d7a293280cb595930cd9c3ada669f05415fcfef47694ed29b not found: ID does not exist" Apr 21 04:28:16.432899 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.432847 2575 scope.go:117] "RemoveContainer" containerID="c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88" Apr 21 04:28:16.433088 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:28:16.433068 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88\": container with ID starting with c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88 not found: ID does not exist" containerID="c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88" Apr 21 04:28:16.433127 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.433094 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88"} err="failed to get container status \"c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88\": rpc error: code = NotFound desc = could not find container \"c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88\": container with ID starting with c6c9df812b81290cf0b300c16cf56d97fe82e3208d71805cec05745aed9e1a88 not found: ID does not exist" Apr 21 04:28:16.464738 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.464709 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"] Apr 21 04:28:16.469125 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:16.469096 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8ab562-predictor-649c5ff779-lq6gs"] Apr 21 04:28:17.027704 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:17.027667 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="718b0425-c740-48fb-a8a5-16f62d7b64ba" path="/var/lib/kubelet/pods/718b0425-c740-48fb-a8a5-16f62d7b64ba/volumes" Apr 21 04:28:20.429861 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:20.429827 2575 generic.go:358] "Generic (PLEG): container finished" podID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerID="5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d" exitCode=0 Apr 21 04:28:20.430226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:20.429901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" event={"ID":"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c","Type":"ContainerDied","Data":"5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d"} Apr 21 04:28:44.509348 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:44.509312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" event={"ID":"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c","Type":"ContainerStarted","Data":"7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed"} Apr 21 04:28:44.509838 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:44.509357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" event={"ID":"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c","Type":"ContainerStarted","Data":"bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989"} Apr 21 04:28:44.509838 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:44.509687 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:44.509838 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:44.509818 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:44.511012 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:44.510985 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:28:44.528913 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:44.528871 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podStartSLOduration=6.540591723 podStartE2EDuration="29.52885921s" podCreationTimestamp="2026-04-21 04:28:15 +0000 UTC" firstStartedPulling="2026-04-21 04:28:20.431076951 +0000 UTC m=+1886.095667015" lastFinishedPulling="2026-04-21 04:28:43.419344438 +0000 UTC m=+1909.083934502" observedRunningTime="2026-04-21 04:28:44.527457716 
+0000 UTC m=+1910.192047803" watchObservedRunningTime="2026-04-21 04:28:44.52885921 +0000 UTC m=+1910.193449295" Apr 21 04:28:45.512772 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:45.512553 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:28:50.516473 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:50.516446 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:28:50.516975 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:28:50.516950 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:29:00.517222 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:29:00.517183 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:29:10.517685 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:29:10.517641 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:29:20.517615 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:29:20.517572 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:29:30.517459 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:29:30.517415 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:29:40.517693 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:29:40.517648 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:29:50.517434 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:29:50.517348 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 21 04:30:00.517906 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:00.517870 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:30:05.364213 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.364184 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw"] Apr 21 04:30:05.364599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.364449 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" containerID="cri-o://bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989" gracePeriod=30 Apr 21 04:30:05.364599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.364498 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kube-rbac-proxy" containerID="cri-o://7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed" gracePeriod=30 Apr 21 04:30:05.466387 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.466358 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx"] Apr 21 04:30:05.466655 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.466642 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="718b0425-c740-48fb-a8a5-16f62d7b64ba" containerName="storage-initializer" Apr 21 04:30:05.466700 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.466659 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b0425-c740-48fb-a8a5-16f62d7b64ba" containerName="storage-initializer" Apr 21 04:30:05.466700 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.466669 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="718b0425-c740-48fb-a8a5-16f62d7b64ba" containerName="storage-initializer" Apr 21 04:30:05.466700 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.466675 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b0425-c740-48fb-a8a5-16f62d7b64ba" containerName="storage-initializer" Apr 21 04:30:05.466831 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.466727 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="718b0425-c740-48fb-a8a5-16f62d7b64ba" 
containerName="storage-initializer" Apr 21 04:30:05.466831 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.466739 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="718b0425-c740-48fb-a8a5-16f62d7b64ba" containerName="storage-initializer" Apr 21 04:30:05.473096 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.473068 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.475752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.475729 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 21 04:30:05.475917 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.475779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 21 04:30:05.477858 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.477834 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx"] Apr 21 04:30:05.512471 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.512446 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 21 04:30:05.604577 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.604552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e90985e-66aa-4568-9400-dda8e229ed2f-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.604691 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.604600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e90985e-66aa-4568-9400-dda8e229ed2f-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.604691 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.604668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6mvk\" (UniqueName: \"kubernetes.io/projected/7e90985e-66aa-4568-9400-dda8e229ed2f-kube-api-access-x6mvk\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.604785 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.604744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e90985e-66aa-4568-9400-dda8e229ed2f-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.705312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.705283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e90985e-66aa-4568-9400-dda8e229ed2f-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.705469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.705338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e90985e-66aa-4568-9400-dda8e229ed2f-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.705469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.705376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6mvk\" (UniqueName: \"kubernetes.io/projected/7e90985e-66aa-4568-9400-dda8e229ed2f-kube-api-access-x6mvk\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.705469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.705433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e90985e-66aa-4568-9400-dda8e229ed2f-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.705825 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.705805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e90985e-66aa-4568-9400-dda8e229ed2f-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.706101 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.706083 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e90985e-66aa-4568-9400-dda8e229ed2f-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.707733 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.707713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e90985e-66aa-4568-9400-dda8e229ed2f-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.713778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.713745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6mvk\" (UniqueName: \"kubernetes.io/projected/7e90985e-66aa-4568-9400-dda8e229ed2f-kube-api-access-x6mvk\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.737007 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.736982 2575 generic.go:358] "Generic (PLEG): container finished" podID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerID="7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed" exitCode=2 Apr 21 04:30:05.737107 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.737046 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" 
event={"ID":"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c","Type":"ContainerDied","Data":"7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed"} Apr 21 04:30:05.784413 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.784391 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:05.902870 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:05.902829 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx"] Apr 21 04:30:05.905549 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:30:05.905526 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e90985e_66aa_4568_9400_dda8e229ed2f.slice/crio-078c42b6ec24ebfd6dd2b09cede75e23f3fa2790b78b39bdb656c199fbc86f38 WatchSource:0}: Error finding container 078c42b6ec24ebfd6dd2b09cede75e23f3fa2790b78b39bdb656c199fbc86f38: Status 404 returned error can't find the container with id 078c42b6ec24ebfd6dd2b09cede75e23f3fa2790b78b39bdb656c199fbc86f38 Apr 21 04:30:06.740697 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:06.740661 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" event={"ID":"7e90985e-66aa-4568-9400-dda8e229ed2f","Type":"ContainerStarted","Data":"788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0"} Apr 21 04:30:06.740697 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:06.740700 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" event={"ID":"7e90985e-66aa-4568-9400-dda8e229ed2f","Type":"ContainerStarted","Data":"078c42b6ec24ebfd6dd2b09cede75e23f3fa2790b78b39bdb656c199fbc86f38"} Apr 21 04:30:09.901385 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:09.901361 2575 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:30:09.937419 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:09.937387 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kserve-provision-location\") pod \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " Apr 21 04:30:09.937590 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:09.937429 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " Apr 21 04:30:09.937590 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:09.937473 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmg2b\" (UniqueName: \"kubernetes.io/projected/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kube-api-access-fmg2b\") pod \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " Apr 21 04:30:09.937773 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:09.937736 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" (UID: "4ba3bb26-284c-4ca1-89f4-d7c405f7c28c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:30:09.937859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:09.937775 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" (UID: "4ba3bb26-284c-4ca1-89f4-d7c405f7c28c"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:30:09.939552 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:09.939525 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kube-api-access-fmg2b" (OuterVolumeSpecName: "kube-api-access-fmg2b") pod "4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" (UID: "4ba3bb26-284c-4ca1-89f4-d7c405f7c28c"). InnerVolumeSpecName "kube-api-access-fmg2b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:30:10.038588 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.038506 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-proxy-tls\") pod \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\" (UID: \"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c\") " Apr 21 04:30:10.038729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.038632 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:30:10.038729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.038644 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:30:10.038729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.038655 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmg2b\" (UniqueName: \"kubernetes.io/projected/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-kube-api-access-fmg2b\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:30:10.040539 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.040511 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" (UID: "4ba3bb26-284c-4ca1-89f4-d7c405f7c28c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:30:10.139068 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.139022 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:30:10.754306 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.754274 2575 generic.go:358] "Generic (PLEG): container finished" podID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerID="bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989" exitCode=0 Apr 21 04:30:10.754447 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.754353 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" Apr 21 04:30:10.754447 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.754354 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" event={"ID":"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c","Type":"ContainerDied","Data":"bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989"} Apr 21 04:30:10.754447 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.754388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw" event={"ID":"4ba3bb26-284c-4ca1-89f4-d7c405f7c28c","Type":"ContainerDied","Data":"0c9bf32660e7eaf995497c093ce66ba5a742ee9a4b1dfb88added5fc3c942b10"} Apr 21 04:30:10.754447 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.754405 2575 scope.go:117] "RemoveContainer" containerID="7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed" Apr 21 04:30:10.755700 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.755679 2575 generic.go:358] "Generic (PLEG): container finished" podID="7e90985e-66aa-4568-9400-dda8e229ed2f" 
containerID="788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0" exitCode=0 Apr 21 04:30:10.755831 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.755707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" event={"ID":"7e90985e-66aa-4568-9400-dda8e229ed2f","Type":"ContainerDied","Data":"788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0"} Apr 21 04:30:10.763201 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.762732 2575 scope.go:117] "RemoveContainer" containerID="bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989" Apr 21 04:30:10.769810 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.769788 2575 scope.go:117] "RemoveContainer" containerID="5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d" Apr 21 04:30:10.782145 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.782006 2575 scope.go:117] "RemoveContainer" containerID="7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed" Apr 21 04:30:10.782335 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:30:10.782301 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed\": container with ID starting with 7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed not found: ID does not exist" containerID="7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed" Apr 21 04:30:10.782389 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.782343 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed"} err="failed to get container status \"7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed\": rpc error: code = NotFound desc = could not find container 
\"7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed\": container with ID starting with 7bbd76e43bbe4f5bf596ededc9dd662630bea028252e746c2a03a459d9f983ed not found: ID does not exist" Apr 21 04:30:10.782389 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.782363 2575 scope.go:117] "RemoveContainer" containerID="bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989" Apr 21 04:30:10.782579 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:30:10.782566 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989\": container with ID starting with bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989 not found: ID does not exist" containerID="bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989" Apr 21 04:30:10.782616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.782582 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989"} err="failed to get container status \"bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989\": rpc error: code = NotFound desc = could not find container \"bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989\": container with ID starting with bee981ebf0f0a7454eca1baad7bc7bc730a5ca2ebba22bdd743a4640e1aaa989 not found: ID does not exist" Apr 21 04:30:10.782616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.782596 2575 scope.go:117] "RemoveContainer" containerID="5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d" Apr 21 04:30:10.782879 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:30:10.782861 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d\": container with ID starting with 
5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d not found: ID does not exist" containerID="5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d" Apr 21 04:30:10.782941 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.782884 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d"} err="failed to get container status \"5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d\": rpc error: code = NotFound desc = could not find container \"5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d\": container with ID starting with 5ca656eb63b0fe71a4f097915c1ec96944619137e2f920de3aac6907074f1d4d not found: ID does not exist" Apr 21 04:30:10.787512 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.787475 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw"] Apr 21 04:30:10.788854 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:10.788833 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-4qxbw"] Apr 21 04:30:11.028021 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:11.027980 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" path="/var/lib/kubelet/pods/4ba3bb26-284c-4ca1-89f4-d7c405f7c28c/volumes" Apr 21 04:30:11.760934 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:11.760891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" event={"ID":"7e90985e-66aa-4568-9400-dda8e229ed2f","Type":"ContainerStarted","Data":"6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f"} Apr 21 04:30:11.760934 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:11.760930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" event={"ID":"7e90985e-66aa-4568-9400-dda8e229ed2f","Type":"ContainerStarted","Data":"7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe"} Apr 21 04:30:11.761283 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:11.761250 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:11.761283 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:11.761284 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:11.762772 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:11.762728 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:30:11.779128 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:11.779080 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podStartSLOduration=6.779067256 podStartE2EDuration="6.779067256s" podCreationTimestamp="2026-04-21 04:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:30:11.777954076 +0000 UTC m=+1997.442544161" watchObservedRunningTime="2026-04-21 04:30:11.779067256 +0000 UTC m=+1997.443657367" Apr 21 04:30:12.765172 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:12.765132 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:30:17.769852 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:17.769820 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:30:17.770334 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:17.770307 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:30:27.771341 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:27.771300 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:30:37.770488 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:37.770444 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:30:47.771208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:47.771167 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:30:57.770975 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:30:57.770936 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:31:07.771163 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:07.771122 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:31:17.771361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:17.771272 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:31:27.770924 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:27.770887 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:31:35.573336 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.573302 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx"] Apr 21 04:31:35.573841 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.573635 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" containerID="cri-o://7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe" gracePeriod=30 Apr 21 04:31:35.573841 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.573690 2575 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kube-rbac-proxy" containerID="cri-o://6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f" gracePeriod=30 Apr 21 04:31:35.679323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679276 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt"] Apr 21 04:31:35.679663 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679645 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" Apr 21 04:31:35.679663 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679663 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" Apr 21 04:31:35.679866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679678 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="storage-initializer" Apr 21 04:31:35.679866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679683 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="storage-initializer" Apr 21 04:31:35.679866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679694 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kube-rbac-proxy" Apr 21 04:31:35.679866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679703 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kube-rbac-proxy" Apr 21 04:31:35.679866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679783 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kube-rbac-proxy" Apr 21 04:31:35.679866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.679793 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ba3bb26-284c-4ca1-89f4-d7c405f7c28c" containerName="kserve-container" Apr 21 04:31:35.682815 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.682791 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.685243 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.685209 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 21 04:31:35.685373 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.685355 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 21 04:31:35.693495 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.693470 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt"] Apr 21 04:31:35.774771 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.774722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59fdec78-a20e-4234-832b-b10aa390e635-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.774940 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.774824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/59fdec78-a20e-4234-832b-b10aa390e635-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.774940 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.774846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59fdec78-a20e-4234-832b-b10aa390e635-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.774940 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.774867 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjps\" (UniqueName: \"kubernetes.io/projected/59fdec78-a20e-4234-832b-b10aa390e635-kube-api-access-wqjps\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.875633 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.875586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59fdec78-a20e-4234-832b-b10aa390e635-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.875633 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.875634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59fdec78-a20e-4234-832b-b10aa390e635-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.875946 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.875667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqjps\" (UniqueName: \"kubernetes.io/projected/59fdec78-a20e-4234-832b-b10aa390e635-kube-api-access-wqjps\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.875946 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.875719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59fdec78-a20e-4234-832b-b10aa390e635-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.876141 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.876120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59fdec78-a20e-4234-832b-b10aa390e635-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.876425 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.876401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/59fdec78-a20e-4234-832b-b10aa390e635-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.878264 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.878244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59fdec78-a20e-4234-832b-b10aa390e635-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.883568 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.883544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjps\" (UniqueName: \"kubernetes.io/projected/59fdec78-a20e-4234-832b-b10aa390e635-kube-api-access-wqjps\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:35.995139 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:35.995091 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:36.009451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:36.009295 2575 generic.go:358] "Generic (PLEG): container finished" podID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerID="6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f" exitCode=2 Apr 21 04:31:36.009451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:36.009360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" event={"ID":"7e90985e-66aa-4568-9400-dda8e229ed2f","Type":"ContainerDied","Data":"6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f"} Apr 21 04:31:36.123599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:36.123573 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt"] Apr 21 04:31:36.126428 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:31:36.126366 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59fdec78_a20e_4234_832b_b10aa390e635.slice/crio-46d7b87c902c42c6e87dbf2b973a669f031ce8c9acbf658257564ec803c1c96e WatchSource:0}: Error finding container 46d7b87c902c42c6e87dbf2b973a669f031ce8c9acbf658257564ec803c1c96e: Status 404 returned error can't find the container with id 46d7b87c902c42c6e87dbf2b973a669f031ce8c9acbf658257564ec803c1c96e Apr 21 04:31:36.128172 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:36.128155 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:31:37.013567 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:37.013531 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" 
event={"ID":"59fdec78-a20e-4234-832b-b10aa390e635","Type":"ContainerStarted","Data":"46e3ee5ab32e9c5ef6eeb0616b4c7f06949c13facb887489c22e5e605b097d85"} Apr 21 04:31:37.013567 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:37.013568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" event={"ID":"59fdec78-a20e-4234-832b-b10aa390e635","Type":"ContainerStarted","Data":"46d7b87c902c42c6e87dbf2b973a669f031ce8c9acbf658257564ec803c1c96e"} Apr 21 04:31:37.766643 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:37.766583 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused" Apr 21 04:31:37.771173 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:37.771128 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 21 04:31:41.016185 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.016159 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:31:41.031258 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.031209 2575 generic.go:358] "Generic (PLEG): container finished" podID="59fdec78-a20e-4234-832b-b10aa390e635" containerID="46e3ee5ab32e9c5ef6eeb0616b4c7f06949c13facb887489c22e5e605b097d85" exitCode=0 Apr 21 04:31:41.033396 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.033362 2575 generic.go:358] "Generic (PLEG): container finished" podID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerID="7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe" exitCode=0 Apr 21 04:31:41.033566 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.033526 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" Apr 21 04:31:41.034173 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.034141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" event={"ID":"59fdec78-a20e-4234-832b-b10aa390e635","Type":"ContainerDied","Data":"46e3ee5ab32e9c5ef6eeb0616b4c7f06949c13facb887489c22e5e605b097d85"} Apr 21 04:31:41.034441 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.034408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" event={"ID":"7e90985e-66aa-4568-9400-dda8e229ed2f","Type":"ContainerDied","Data":"7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe"} Apr 21 04:31:41.034441 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.034451 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx" event={"ID":"7e90985e-66aa-4568-9400-dda8e229ed2f","Type":"ContainerDied","Data":"078c42b6ec24ebfd6dd2b09cede75e23f3fa2790b78b39bdb656c199fbc86f38"} Apr 21 04:31:41.034745 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.034476 2575 scope.go:117] "RemoveContainer" containerID="6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f" Apr 21 04:31:41.045598 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.045573 2575 scope.go:117] "RemoveContainer" containerID="7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe" Apr 21 04:31:41.054825 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.054799 2575 scope.go:117] "RemoveContainer" containerID="788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0" Apr 21 04:31:41.066283 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.066256 2575 scope.go:117] "RemoveContainer" containerID="6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f" Apr 21 04:31:41.066556 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:31:41.066535 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f\": container with ID starting with 6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f not found: ID does not exist" containerID="6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f" Apr 21 04:31:41.066625 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.066565 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f"} err="failed to get container status \"6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f\": rpc error: code = NotFound desc = could not find container \"6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f\": container with ID starting with 6eb039fe27415a17f05e8f096d02ce63949700c01e6ae31d0c141528791be93f not found: ID does not exist" Apr 21 04:31:41.066625 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.066589 2575 scope.go:117] "RemoveContainer" 
containerID="7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe" Apr 21 04:31:41.066906 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:31:41.066887 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe\": container with ID starting with 7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe not found: ID does not exist" containerID="7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe" Apr 21 04:31:41.066955 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.066912 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe"} err="failed to get container status \"7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe\": rpc error: code = NotFound desc = could not find container \"7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe\": container with ID starting with 7f780dac4c1146264c6c63838d48da0e8f3ec6e34b1c2ceb7b62a4814797aebe not found: ID does not exist" Apr 21 04:31:41.066955 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.066929 2575 scope.go:117] "RemoveContainer" containerID="788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0" Apr 21 04:31:41.067161 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:31:41.067144 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0\": container with ID starting with 788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0 not found: ID does not exist" containerID="788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0" Apr 21 04:31:41.067208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.067164 2575 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0"} err="failed to get container status \"788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0\": rpc error: code = NotFound desc = could not find container \"788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0\": container with ID starting with 788fd314fd19210d1c12a54bd36f66136de4a2ae32ff9bb498e2143ab12914c0 not found: ID does not exist" Apr 21 04:31:41.110288 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.110266 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e90985e-66aa-4568-9400-dda8e229ed2f-kserve-provision-location\") pod \"7e90985e-66aa-4568-9400-dda8e229ed2f\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " Apr 21 04:31:41.110370 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.110319 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6mvk\" (UniqueName: \"kubernetes.io/projected/7e90985e-66aa-4568-9400-dda8e229ed2f-kube-api-access-x6mvk\") pod \"7e90985e-66aa-4568-9400-dda8e229ed2f\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " Apr 21 04:31:41.110370 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.110347 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e90985e-66aa-4568-9400-dda8e229ed2f-proxy-tls\") pod \"7e90985e-66aa-4568-9400-dda8e229ed2f\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " Apr 21 04:31:41.110476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.110378 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e90985e-66aa-4568-9400-dda8e229ed2f-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod 
\"7e90985e-66aa-4568-9400-dda8e229ed2f\" (UID: \"7e90985e-66aa-4568-9400-dda8e229ed2f\") " Apr 21 04:31:41.110589 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.110566 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e90985e-66aa-4568-9400-dda8e229ed2f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e90985e-66aa-4568-9400-dda8e229ed2f" (UID: "7e90985e-66aa-4568-9400-dda8e229ed2f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:31:41.111246 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.110873 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e90985e-66aa-4568-9400-dda8e229ed2f-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:31:41.111482 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.111456 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e90985e-66aa-4568-9400-dda8e229ed2f-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "7e90985e-66aa-4568-9400-dda8e229ed2f" (UID: "7e90985e-66aa-4568-9400-dda8e229ed2f"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:31:41.112569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.112543 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e90985e-66aa-4568-9400-dda8e229ed2f-kube-api-access-x6mvk" (OuterVolumeSpecName: "kube-api-access-x6mvk") pod "7e90985e-66aa-4568-9400-dda8e229ed2f" (UID: "7e90985e-66aa-4568-9400-dda8e229ed2f"). InnerVolumeSpecName "kube-api-access-x6mvk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:31:41.112629 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.112597 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e90985e-66aa-4568-9400-dda8e229ed2f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7e90985e-66aa-4568-9400-dda8e229ed2f" (UID: "7e90985e-66aa-4568-9400-dda8e229ed2f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:31:41.211468 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.211434 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6mvk\" (UniqueName: \"kubernetes.io/projected/7e90985e-66aa-4568-9400-dda8e229ed2f-kube-api-access-x6mvk\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:31:41.211468 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.211463 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e90985e-66aa-4568-9400-dda8e229ed2f-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:31:41.211468 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.211474 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e90985e-66aa-4568-9400-dda8e229ed2f-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:31:41.356957 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.356927 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx"] Apr 21 04:31:41.360650 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:41.360618 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-vmpxx"] Apr 21 04:31:42.038647 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:31:42.038611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" event={"ID":"59fdec78-a20e-4234-832b-b10aa390e635","Type":"ContainerStarted","Data":"fbb876135871360648d09c08152301c2f1c1fc166c23026bb82506816da57e94"} Apr 21 04:31:42.038647 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:42.038646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" event={"ID":"59fdec78-a20e-4234-832b-b10aa390e635","Type":"ContainerStarted","Data":"cf0ab84095956d5d441ec6cc1772646d259d85f59be63fc70a045a8ba529dd3f"} Apr 21 04:31:42.039164 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:42.038946 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:42.039164 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:42.039074 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:42.040317 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:42.040290 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:31:42.058813 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:42.058738 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podStartSLOduration=7.058721263 podStartE2EDuration="7.058721263s" podCreationTimestamp="2026-04-21 04:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 
04:31:42.056581867 +0000 UTC m=+2087.721171952" watchObservedRunningTime="2026-04-21 04:31:42.058721263 +0000 UTC m=+2087.723311350" Apr 21 04:31:43.027061 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:43.027028 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" path="/var/lib/kubelet/pods/7e90985e-66aa-4568-9400-dda8e229ed2f/volumes" Apr 21 04:31:43.041585 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:43.041555 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:31:48.046197 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:48.046166 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:31:48.046818 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:48.046788 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:31:58.047554 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:31:58.047512 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:32:08.047576 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:32:08.047532 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" 
podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:32:18.046871 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:32:18.046830 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:32:28.047587 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:32:28.047545 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:32:38.046950 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:32:38.046907 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:32:48.047433 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:32:48.047349 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:32:58.047461 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:32:58.047433 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:33:05.802661 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.802632 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt"] Apr 21 04:33:05.803085 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.802976 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" containerID="cri-o://cf0ab84095956d5d441ec6cc1772646d259d85f59be63fc70a045a8ba529dd3f" gracePeriod=30 Apr 21 04:33:05.803085 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.803017 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kube-rbac-proxy" containerID="cri-o://fbb876135871360648d09c08152301c2f1c1fc166c23026bb82506816da57e94" gracePeriod=30 Apr 21 04:33:05.919123 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919096 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd"] Apr 21 04:33:05.919510 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919496 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="storage-initializer" Apr 21 04:33:05.919553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919514 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="storage-initializer" Apr 21 04:33:05.919553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919530 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kube-rbac-proxy" Apr 21 04:33:05.919553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919540 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kube-rbac-proxy" Apr 21 04:33:05.919652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919557 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" Apr 21 04:33:05.919652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919565 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" Apr 21 04:33:05.919652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919629 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kube-rbac-proxy" Apr 21 04:33:05.919652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.919642 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e90985e-66aa-4568-9400-dda8e229ed2f" containerName="kserve-container" Apr 21 04:33:05.922834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.922814 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:05.925436 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.925417 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 21 04:33:05.925541 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.925466 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 21 04:33:05.933153 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:05.933134 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd"] Apr 21 04:33:06.031537 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.031503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bf581ad-49f7-44c9-a615-58c0a766c136-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.031700 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.031559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bf581ad-49f7-44c9-a615-58c0a766c136-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.031700 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.031604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb5w8\" (UniqueName: 
\"kubernetes.io/projected/8bf581ad-49f7-44c9-a615-58c0a766c136-kube-api-access-kb5w8\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.031700 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.031628 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bf581ad-49f7-44c9-a615-58c0a766c136-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.132778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.132734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bf581ad-49f7-44c9-a615-58c0a766c136-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.132951 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.132785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb5w8\" (UniqueName: \"kubernetes.io/projected/8bf581ad-49f7-44c9-a615-58c0a766c136-kube-api-access-kb5w8\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.132951 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.132915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bf581ad-49f7-44c9-a615-58c0a766c136-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.133088 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.132981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bf581ad-49f7-44c9-a615-58c0a766c136-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.133179 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.133155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bf581ad-49f7-44c9-a615-58c0a766c136-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.133531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.133509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bf581ad-49f7-44c9-a615-58c0a766c136-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.135306 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.135284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bf581ad-49f7-44c9-a615-58c0a766c136-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.141463 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.141436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb5w8\" (UniqueName: \"kubernetes.io/projected/8bf581ad-49f7-44c9-a615-58c0a766c136-kube-api-access-kb5w8\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.233302 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.233275 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:06.271385 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.271356 2575 generic.go:358] "Generic (PLEG): container finished" podID="59fdec78-a20e-4234-832b-b10aa390e635" containerID="fbb876135871360648d09c08152301c2f1c1fc166c23026bb82506816da57e94" exitCode=2 Apr 21 04:33:06.271518 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.271433 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" event={"ID":"59fdec78-a20e-4234-832b-b10aa390e635","Type":"ContainerDied","Data":"fbb876135871360648d09c08152301c2f1c1fc166c23026bb82506816da57e94"} Apr 21 04:33:06.350733 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:06.350701 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd"] Apr 21 04:33:06.353422 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:33:06.353392 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bf581ad_49f7_44c9_a615_58c0a766c136.slice/crio-8c992fdf559fb478b470be77f044bbf67fa778888aafbf3f64941966f245d966 WatchSource:0}: Error finding container 8c992fdf559fb478b470be77f044bbf67fa778888aafbf3f64941966f245d966: Status 404 returned error can't find the container with id 8c992fdf559fb478b470be77f044bbf67fa778888aafbf3f64941966f245d966 Apr 21 04:33:07.275335 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:07.275294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" event={"ID":"8bf581ad-49f7-44c9-a615-58c0a766c136","Type":"ContainerStarted","Data":"e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d"} Apr 21 04:33:07.275335 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:07.275333 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" event={"ID":"8bf581ad-49f7-44c9-a615-58c0a766c136","Type":"ContainerStarted","Data":"8c992fdf559fb478b470be77f044bbf67fa778888aafbf3f64941966f245d966"} Apr 21 04:33:08.042264 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:08.042222 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 21 04:33:08.047575 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:08.047552 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 21 04:33:10.286390 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:33:10.286361 2575 generic.go:358] "Generic (PLEG): container finished" podID="59fdec78-a20e-4234-832b-b10aa390e635" containerID="cf0ab84095956d5d441ec6cc1772646d259d85f59be63fc70a045a8ba529dd3f" exitCode=0 Apr 21 04:33:10.286714 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.286434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" event={"ID":"59fdec78-a20e-4234-832b-b10aa390e635","Type":"ContainerDied","Data":"cf0ab84095956d5d441ec6cc1772646d259d85f59be63fc70a045a8ba529dd3f"} Apr 21 04:33:10.287672 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.287654 2575 generic.go:358] "Generic (PLEG): container finished" podID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerID="e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d" exitCode=0 Apr 21 04:33:10.287788 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.287689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" event={"ID":"8bf581ad-49f7-44c9-a615-58c0a766c136","Type":"ContainerDied","Data":"e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d"} Apr 21 04:33:10.542104 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.542077 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:33:10.670195 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.670165 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59fdec78-a20e-4234-832b-b10aa390e635-proxy-tls\") pod \"59fdec78-a20e-4234-832b-b10aa390e635\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " Apr 21 04:33:10.670346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.670204 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59fdec78-a20e-4234-832b-b10aa390e635-kserve-provision-location\") pod \"59fdec78-a20e-4234-832b-b10aa390e635\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " Apr 21 04:33:10.670346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.670240 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqjps\" (UniqueName: \"kubernetes.io/projected/59fdec78-a20e-4234-832b-b10aa390e635-kube-api-access-wqjps\") pod \"59fdec78-a20e-4234-832b-b10aa390e635\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " Apr 21 04:33:10.670346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.670269 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59fdec78-a20e-4234-832b-b10aa390e635-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"59fdec78-a20e-4234-832b-b10aa390e635\" (UID: \"59fdec78-a20e-4234-832b-b10aa390e635\") " Apr 21 04:33:10.670661 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.670618 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59fdec78-a20e-4234-832b-b10aa390e635-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"59fdec78-a20e-4234-832b-b10aa390e635" (UID: "59fdec78-a20e-4234-832b-b10aa390e635"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:33:10.670661 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.670631 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59fdec78-a20e-4234-832b-b10aa390e635-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "59fdec78-a20e-4234-832b-b10aa390e635" (UID: "59fdec78-a20e-4234-832b-b10aa390e635"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:33:10.672138 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.672118 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59fdec78-a20e-4234-832b-b10aa390e635-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "59fdec78-a20e-4234-832b-b10aa390e635" (UID: "59fdec78-a20e-4234-832b-b10aa390e635"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:33:10.672233 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.672214 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59fdec78-a20e-4234-832b-b10aa390e635-kube-api-access-wqjps" (OuterVolumeSpecName: "kube-api-access-wqjps") pod "59fdec78-a20e-4234-832b-b10aa390e635" (UID: "59fdec78-a20e-4234-832b-b10aa390e635"). InnerVolumeSpecName "kube-api-access-wqjps". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:33:10.771431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.771397 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59fdec78-a20e-4234-832b-b10aa390e635-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:33:10.771431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.771425 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59fdec78-a20e-4234-832b-b10aa390e635-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:33:10.771612 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.771438 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqjps\" (UniqueName: \"kubernetes.io/projected/59fdec78-a20e-4234-832b-b10aa390e635-kube-api-access-wqjps\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:33:10.771612 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:10.771453 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59fdec78-a20e-4234-832b-b10aa390e635-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:33:11.293007 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.292914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" event={"ID":"8bf581ad-49f7-44c9-a615-58c0a766c136","Type":"ContainerStarted","Data":"83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003"} Apr 21 04:33:11.293007 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.292963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" 
event={"ID":"8bf581ad-49f7-44c9-a615-58c0a766c136","Type":"ContainerStarted","Data":"48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c"} Apr 21 04:33:11.293446 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.293184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:11.294451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.294431 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" event={"ID":"59fdec78-a20e-4234-832b-b10aa390e635","Type":"ContainerDied","Data":"46d7b87c902c42c6e87dbf2b973a669f031ce8c9acbf658257564ec803c1c96e"} Apr 21 04:33:11.294589 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.294462 2575 scope.go:117] "RemoveContainer" containerID="fbb876135871360648d09c08152301c2f1c1fc166c23026bb82506816da57e94" Apr 21 04:33:11.294589 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.294465 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt" Apr 21 04:33:11.301655 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.301640 2575 scope.go:117] "RemoveContainer" containerID="cf0ab84095956d5d441ec6cc1772646d259d85f59be63fc70a045a8ba529dd3f" Apr 21 04:33:11.308137 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.308122 2575 scope.go:117] "RemoveContainer" containerID="46e3ee5ab32e9c5ef6eeb0616b4c7f06949c13facb887489c22e5e605b097d85" Apr 21 04:33:11.311730 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.311689 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podStartSLOduration=6.311674099 podStartE2EDuration="6.311674099s" podCreationTimestamp="2026-04-21 04:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:33:11.310641689 +0000 UTC m=+2176.975231776" watchObservedRunningTime="2026-04-21 04:33:11.311674099 +0000 UTC m=+2176.976264193" Apr 21 04:33:11.322872 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.322852 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt"] Apr 21 04:33:11.326435 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:11.326414 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-4pldt"] Apr 21 04:33:12.297504 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:12.297480 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:13.027841 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:13.027799 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59fdec78-a20e-4234-832b-b10aa390e635" 
path="/var/lib/kubelet/pods/59fdec78-a20e-4234-832b-b10aa390e635/volumes" Apr 21 04:33:18.305650 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:18.305624 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:33:48.306778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:48.306730 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 21 04:33:58.306920 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:33:58.306876 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 21 04:34:08.307005 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:08.306968 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 21 04:34:18.306253 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:18.306167 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" probeResult="failure" 
output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 21 04:34:20.026822 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:20.026780 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:34:26.005904 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.005871 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd"] Apr 21 04:34:26.006387 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.006232 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" containerID="cri-o://48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c" gracePeriod=30 Apr 21 04:34:26.006387 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.006258 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kube-rbac-proxy" containerID="cri-o://83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003" gracePeriod=30 Apr 21 04:34:26.074963 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.074927 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4"] Apr 21 04:34:26.075302 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.075286 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="storage-initializer" Apr 21 04:34:26.075352 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.075304 2575 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="storage-initializer" Apr 21 04:34:26.075352 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.075317 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" Apr 21 04:34:26.075352 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.075323 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" Apr 21 04:34:26.075352 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.075339 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kube-rbac-proxy" Apr 21 04:34:26.075352 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.075346 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kube-rbac-proxy" Apr 21 04:34:26.075507 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.075392 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kserve-container" Apr 21 04:34:26.075507 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.075400 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="59fdec78-a20e-4234-832b-b10aa390e635" containerName="kube-rbac-proxy" Apr 21 04:34:26.078569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.078552 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.081287 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.081265 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 21 04:34:26.081390 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.081297 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 21 04:34:26.087958 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.087936 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4"] Apr 21 04:34:26.169825 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.169747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjk8b\" (UniqueName: \"kubernetes.io/projected/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kube-api-access-xjk8b\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.169967 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.169852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d39ed7c8-b6d3-453a-8bfb-77825958a03d-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.169967 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.169888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.169967 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.169913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d39ed7c8-b6d3-453a-8bfb-77825958a03d-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.271290 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.271214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjk8b\" (UniqueName: \"kubernetes.io/projected/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kube-api-access-xjk8b\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.271290 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.271250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d39ed7c8-b6d3-453a-8bfb-77825958a03d-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.271290 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.271279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.271547 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.271311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d39ed7c8-b6d3-453a-8bfb-77825958a03d-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.271861 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.271834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.272065 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.272046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d39ed7c8-b6d3-453a-8bfb-77825958a03d-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.273623 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.273596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d39ed7c8-b6d3-453a-8bfb-77825958a03d-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.279807 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.279776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjk8b\" (UniqueName: \"kubernetes.io/projected/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kube-api-access-xjk8b\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.389149 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.389127 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:26.492313 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.492282 2575 generic.go:358] "Generic (PLEG): container finished" podID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerID="83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003" exitCode=2 Apr 21 04:34:26.492470 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.492353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" event={"ID":"8bf581ad-49f7-44c9-a615-58c0a766c136","Type":"ContainerDied","Data":"83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003"} Apr 21 04:34:26.504651 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:26.504533 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4"] Apr 21 04:34:26.507145 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:34:26.507118 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd39ed7c8_b6d3_453a_8bfb_77825958a03d.slice/crio-50895accce241cb11899293884841aa33a1aa258af19d2772b8c1d84f51cb7dd WatchSource:0}: Error finding container 50895accce241cb11899293884841aa33a1aa258af19d2772b8c1d84f51cb7dd: Status 404 returned error can't find the container with id 50895accce241cb11899293884841aa33a1aa258af19d2772b8c1d84f51cb7dd Apr 21 04:34:27.502034 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:27.501998 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" event={"ID":"d39ed7c8-b6d3-453a-8bfb-77825958a03d","Type":"ContainerStarted","Data":"594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155"} Apr 21 04:34:27.502034 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:27.502035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" event={"ID":"d39ed7c8-b6d3-453a-8bfb-77825958a03d","Type":"ContainerStarted","Data":"50895accce241cb11899293884841aa33a1aa258af19d2772b8c1d84f51cb7dd"} Apr 21 04:34:28.301528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:28.301487 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.43:8643/healthz\": dial tcp 10.132.0.43:8643: connect: connection refused" Apr 21 04:34:30.024097 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.024058 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: 
connect: connection refused" Apr 21 04:34:30.253800 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.253777 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:34:30.302093 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.302031 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bf581ad-49f7-44c9-a615-58c0a766c136-kserve-provision-location\") pod \"8bf581ad-49f7-44c9-a615-58c0a766c136\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " Apr 21 04:34:30.302093 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.302059 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb5w8\" (UniqueName: \"kubernetes.io/projected/8bf581ad-49f7-44c9-a615-58c0a766c136-kube-api-access-kb5w8\") pod \"8bf581ad-49f7-44c9-a615-58c0a766c136\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " Apr 21 04:34:30.302298 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.302099 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bf581ad-49f7-44c9-a615-58c0a766c136-proxy-tls\") pod \"8bf581ad-49f7-44c9-a615-58c0a766c136\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " Apr 21 04:34:30.302298 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.302222 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bf581ad-49f7-44c9-a615-58c0a766c136-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"8bf581ad-49f7-44c9-a615-58c0a766c136\" (UID: \"8bf581ad-49f7-44c9-a615-58c0a766c136\") " Apr 21 04:34:30.302410 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.302357 2575 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/8bf581ad-49f7-44c9-a615-58c0a766c136-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8bf581ad-49f7-44c9-a615-58c0a766c136" (UID: "8bf581ad-49f7-44c9-a615-58c0a766c136"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:34:30.302543 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.302522 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bf581ad-49f7-44c9-a615-58c0a766c136-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:34:30.302611 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.302573 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf581ad-49f7-44c9-a615-58c0a766c136-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "8bf581ad-49f7-44c9-a615-58c0a766c136" (UID: "8bf581ad-49f7-44c9-a615-58c0a766c136"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:34:30.304252 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.304227 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf581ad-49f7-44c9-a615-58c0a766c136-kube-api-access-kb5w8" (OuterVolumeSpecName: "kube-api-access-kb5w8") pod "8bf581ad-49f7-44c9-a615-58c0a766c136" (UID: "8bf581ad-49f7-44c9-a615-58c0a766c136"). InnerVolumeSpecName "kube-api-access-kb5w8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:34:30.304657 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.304639 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf581ad-49f7-44c9-a615-58c0a766c136-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8bf581ad-49f7-44c9-a615-58c0a766c136" (UID: "8bf581ad-49f7-44c9-a615-58c0a766c136"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:34:30.403206 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.403176 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bf581ad-49f7-44c9-a615-58c0a766c136-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:34:30.403206 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.403203 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kb5w8\" (UniqueName: \"kubernetes.io/projected/8bf581ad-49f7-44c9-a615-58c0a766c136-kube-api-access-kb5w8\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:34:30.403348 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.403217 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bf581ad-49f7-44c9-a615-58c0a766c136-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:34:30.512113 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.512085 2575 generic.go:358] "Generic (PLEG): container finished" podID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerID="594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155" exitCode=0 Apr 21 04:34:30.512222 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.512163 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" 
event={"ID":"d39ed7c8-b6d3-453a-8bfb-77825958a03d","Type":"ContainerDied","Data":"594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155"} Apr 21 04:34:30.513946 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.513915 2575 generic.go:358] "Generic (PLEG): container finished" podID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerID="48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c" exitCode=0 Apr 21 04:34:30.514040 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.513954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" event={"ID":"8bf581ad-49f7-44c9-a615-58c0a766c136","Type":"ContainerDied","Data":"48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c"} Apr 21 04:34:30.514040 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.513972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" event={"ID":"8bf581ad-49f7-44c9-a615-58c0a766c136","Type":"ContainerDied","Data":"8c992fdf559fb478b470be77f044bbf67fa778888aafbf3f64941966f245d966"} Apr 21 04:34:30.514040 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.513989 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd" Apr 21 04:34:30.514195 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.513991 2575 scope.go:117] "RemoveContainer" containerID="83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003" Apr 21 04:34:30.521585 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.521564 2575 scope.go:117] "RemoveContainer" containerID="48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c" Apr 21 04:34:30.528513 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.528497 2575 scope.go:117] "RemoveContainer" containerID="e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d" Apr 21 04:34:30.535520 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.535501 2575 scope.go:117] "RemoveContainer" containerID="83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003" Apr 21 04:34:30.535843 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:34:30.535815 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003\": container with ID starting with 83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003 not found: ID does not exist" containerID="83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003" Apr 21 04:34:30.535917 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.535856 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003"} err="failed to get container status \"83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003\": rpc error: code = NotFound desc = could not find container \"83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003\": container with ID starting with 83106098ec72954b6156213c49e2e6c248f7aef175635a3a738b2505064c5003 not found: ID does not exist" Apr 21 
04:34:30.535917 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.535883 2575 scope.go:117] "RemoveContainer" containerID="48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c" Apr 21 04:34:30.536127 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:34:30.536108 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c\": container with ID starting with 48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c not found: ID does not exist" containerID="48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c" Apr 21 04:34:30.536170 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.536137 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c"} err="failed to get container status \"48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c\": rpc error: code = NotFound desc = could not find container \"48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c\": container with ID starting with 48af0af165150ab6d2c808b7b44cd968af3199f99824ad95e904670c0db5c40c not found: ID does not exist" Apr 21 04:34:30.536170 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.536157 2575 scope.go:117] "RemoveContainer" containerID="e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d" Apr 21 04:34:30.536389 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:34:30.536365 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d\": container with ID starting with e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d not found: ID does not exist" containerID="e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d" Apr 21 04:34:30.536460 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.536397 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d"} err="failed to get container status \"e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d\": rpc error: code = NotFound desc = could not find container \"e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d\": container with ID starting with e2b79dc156363a84317cc5551c8524b7f6731dd54ead09023ebd5fab55b7ce0d not found: ID does not exist" Apr 21 04:34:30.549421 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.549399 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd"] Apr 21 04:34:30.553102 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:30.553047 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-qtfgd"] Apr 21 04:34:31.028193 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:31.028162 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" path="/var/lib/kubelet/pods/8bf581ad-49f7-44c9-a615-58c0a766c136/volumes" Apr 21 04:34:31.519892 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:31.519860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" event={"ID":"d39ed7c8-b6d3-453a-8bfb-77825958a03d","Type":"ContainerStarted","Data":"1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7"} Apr 21 04:34:31.519892 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:31.519893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" 
event={"ID":"d39ed7c8-b6d3-453a-8bfb-77825958a03d","Type":"ContainerStarted","Data":"dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07"} Apr 21 04:34:31.520099 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:31.520086 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:31.538155 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:31.538114 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podStartSLOduration=5.538100399 podStartE2EDuration="5.538100399s" podCreationTimestamp="2026-04-21 04:34:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:34:31.537259158 +0000 UTC m=+2257.201849243" watchObservedRunningTime="2026-04-21 04:34:31.538100399 +0000 UTC m=+2257.202690485" Apr 21 04:34:32.522404 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:32.522379 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:34:38.530854 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:34:38.530825 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:35:08.531433 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:08.531396 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 04:35:18.532161 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:35:18.532125 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 04:35:28.531616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:28.531574 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 04:35:38.532110 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:38.532068 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 04:35:48.535438 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:48.535351 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:35:56.217814 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.217780 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4"] Apr 21 04:35:56.218294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.218191 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" containerID="cri-o://dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07" gracePeriod=30 Apr 21 04:35:56.218294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.218210 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kube-rbac-proxy" containerID="cri-o://1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7" gracePeriod=30 Apr 21 04:35:56.330814 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.330787 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28"] Apr 21 04:35:56.331156 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.331141 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="storage-initializer" Apr 21 04:35:56.331200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.331159 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="storage-initializer" Apr 21 04:35:56.331200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.331178 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" Apr 21 04:35:56.331200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.331184 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" Apr 21 04:35:56.331200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.331197 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" 
containerName="kube-rbac-proxy" Apr 21 04:35:56.331323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.331203 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kube-rbac-proxy" Apr 21 04:35:56.331323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.331256 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kserve-container" Apr 21 04:35:56.331323 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.331265 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bf581ad-49f7-44c9-a615-58c0a766c136" containerName="kube-rbac-proxy" Apr 21 04:35:56.334875 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.334859 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.337340 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.337322 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 21 04:35:56.337421 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.337322 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 21 04:35:56.344147 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.344127 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28"] Apr 21 04:35:56.402975 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.402950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kserve-provision-location\") pod 
\"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.403070 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.402997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.403128 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.403106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.403177 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.403142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jggb\" (UniqueName: \"kubernetes.io/projected/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kube-api-access-2jggb\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.503877 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.503819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.503877 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.503850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jggb\" (UniqueName: \"kubernetes.io/projected/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kube-api-access-2jggb\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.504004 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.503887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.504004 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.503913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.504100 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:35:56.504003 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-serving-cert: secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not 
found Apr 21 04:35:56.504100 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:35:56.504062 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls podName:b0d80896-6bd3-4068-95d1-7ec1bfcc1eee nodeName:}" failed. No retries permitted until 2026-04-21 04:35:57.004046659 +0000 UTC m=+2342.668636734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls") pod "isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" (UID: "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee") : secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 21 04:35:56.504273 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.504255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.504497 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.504478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.513171 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.513145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jggb\" (UniqueName: 
\"kubernetes.io/projected/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kube-api-access-2jggb\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:56.749438 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.749408 2575 generic.go:358] "Generic (PLEG): container finished" podID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerID="1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7" exitCode=2 Apr 21 04:35:56.749598 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:56.749471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" event={"ID":"d39ed7c8-b6d3-453a-8bfb-77825958a03d","Type":"ContainerDied","Data":"1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7"} Apr 21 04:35:57.007425 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:57.007397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:57.009779 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:57.009739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:57.245357 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:57.245321 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:35:57.361604 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:57.361577 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28"] Apr 21 04:35:57.363697 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:35:57.363655 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d80896_6bd3_4068_95d1_7ec1bfcc1eee.slice/crio-72c76577c85010cfcf7de71fb4ebb2bee87645162c3bebef92c5c69838a09284 WatchSource:0}: Error finding container 72c76577c85010cfcf7de71fb4ebb2bee87645162c3bebef92c5c69838a09284: Status 404 returned error can't find the container with id 72c76577c85010cfcf7de71fb4ebb2bee87645162c3bebef92c5c69838a09284 Apr 21 04:35:57.753263 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:57.753227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" event={"ID":"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee","Type":"ContainerStarted","Data":"81d054289fc2e533bcd2ef86e6ee38e034255f4ce1ed640a3764348a8c0a1cad"} Apr 21 04:35:57.753425 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:57.753269 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" event={"ID":"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee","Type":"ContainerStarted","Data":"72c76577c85010cfcf7de71fb4ebb2bee87645162c3bebef92c5c69838a09284"} Apr 21 04:35:58.525078 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:58.525038 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.44:8643/healthz\": dial tcp 
10.132.0.44:8643: connect: connection refused" Apr 21 04:35:58.532221 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:35:58.532188 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 21 04:36:00.458002 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.457980 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:36:00.530181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.530119 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kserve-provision-location\") pod \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " Apr 21 04:36:00.530181 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.530155 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjk8b\" (UniqueName: \"kubernetes.io/projected/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kube-api-access-xjk8b\") pod \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " Apr 21 04:36:00.530388 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.530231 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d39ed7c8-b6d3-453a-8bfb-77825958a03d-proxy-tls\") pod \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " Apr 21 04:36:00.530388 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.530262 2575 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d39ed7c8-b6d3-453a-8bfb-77825958a03d-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\" (UID: \"d39ed7c8-b6d3-453a-8bfb-77825958a03d\") " Apr 21 04:36:00.530508 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.530470 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d39ed7c8-b6d3-453a-8bfb-77825958a03d" (UID: "d39ed7c8-b6d3-453a-8bfb-77825958a03d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:36:00.530660 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.530630 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d39ed7c8-b6d3-453a-8bfb-77825958a03d-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "d39ed7c8-b6d3-453a-8bfb-77825958a03d" (UID: "d39ed7c8-b6d3-453a-8bfb-77825958a03d"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:36:00.532142 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.532112 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kube-api-access-xjk8b" (OuterVolumeSpecName: "kube-api-access-xjk8b") pod "d39ed7c8-b6d3-453a-8bfb-77825958a03d" (UID: "d39ed7c8-b6d3-453a-8bfb-77825958a03d"). InnerVolumeSpecName "kube-api-access-xjk8b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:36:00.532232 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.532139 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39ed7c8-b6d3-453a-8bfb-77825958a03d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d39ed7c8-b6d3-453a-8bfb-77825958a03d" (UID: "d39ed7c8-b6d3-453a-8bfb-77825958a03d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:36:00.631160 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.631133 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:36:00.631160 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.631158 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xjk8b\" (UniqueName: \"kubernetes.io/projected/d39ed7c8-b6d3-453a-8bfb-77825958a03d-kube-api-access-xjk8b\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:36:00.631297 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.631171 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d39ed7c8-b6d3-453a-8bfb-77825958a03d-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:36:00.631297 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.631187 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d39ed7c8-b6d3-453a-8bfb-77825958a03d-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:36:00.762342 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.762315 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerID="dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07" exitCode=0 Apr 21 04:36:00.762468 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.762394 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" Apr 21 04:36:00.762468 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.762410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" event={"ID":"d39ed7c8-b6d3-453a-8bfb-77825958a03d","Type":"ContainerDied","Data":"dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07"} Apr 21 04:36:00.762468 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.762457 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4" event={"ID":"d39ed7c8-b6d3-453a-8bfb-77825958a03d","Type":"ContainerDied","Data":"50895accce241cb11899293884841aa33a1aa258af19d2772b8c1d84f51cb7dd"} Apr 21 04:36:00.762636 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.762478 2575 scope.go:117] "RemoveContainer" containerID="1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7" Apr 21 04:36:00.775687 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.775665 2575 scope.go:117] "RemoveContainer" containerID="dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07" Apr 21 04:36:00.782423 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.782406 2575 scope.go:117] "RemoveContainer" containerID="594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155" Apr 21 04:36:00.787912 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.787892 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4"] Apr 21 04:36:00.791099 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.791077 2575 
scope.go:117] "RemoveContainer" containerID="1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7" Apr 21 04:36:00.791556 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.791394 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-xqcl4"] Apr 21 04:36:00.791556 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:36:00.791441 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7\": container with ID starting with 1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7 not found: ID does not exist" containerID="1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7" Apr 21 04:36:00.791556 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.791470 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7"} err="failed to get container status \"1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7\": rpc error: code = NotFound desc = could not find container \"1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7\": container with ID starting with 1a016b0af6b7b56ab9ba31ef86c09e7173c32f8abe0b2d3db0832ce84c5211c7 not found: ID does not exist" Apr 21 04:36:00.791556 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.791493 2575 scope.go:117] "RemoveContainer" containerID="dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07" Apr 21 04:36:00.792861 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:36:00.792839 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07\": container with ID starting with dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07 not 
found: ID does not exist" containerID="dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07" Apr 21 04:36:00.792947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.792865 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07"} err="failed to get container status \"dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07\": rpc error: code = NotFound desc = could not find container \"dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07\": container with ID starting with dc30b2f5b207e39aa0720fada19df22074b039b69c94812df8935fcc6a98cf07 not found: ID does not exist" Apr 21 04:36:00.792947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.792881 2575 scope.go:117] "RemoveContainer" containerID="594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155" Apr 21 04:36:00.793314 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:36:00.793298 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155\": container with ID starting with 594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155 not found: ID does not exist" containerID="594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155" Apr 21 04:36:00.793403 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:00.793318 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155"} err="failed to get container status \"594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155\": rpc error: code = NotFound desc = could not find container \"594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155\": container with ID starting with 594485afc2c37a18c32c5349b55cd46faa086224f4b7407c439ffe5318dfb155 not found: ID does not 
exist" Apr 21 04:36:01.028201 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:01.028175 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" path="/var/lib/kubelet/pods/d39ed7c8-b6d3-453a-8bfb-77825958a03d/volumes" Apr 21 04:36:01.766152 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:01.766120 2575 generic.go:358] "Generic (PLEG): container finished" podID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerID="81d054289fc2e533bcd2ef86e6ee38e034255f4ce1ed640a3764348a8c0a1cad" exitCode=0 Apr 21 04:36:01.766539 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:01.766194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" event={"ID":"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee","Type":"ContainerDied","Data":"81d054289fc2e533bcd2ef86e6ee38e034255f4ce1ed640a3764348a8c0a1cad"} Apr 21 04:36:02.771200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:02.771168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" event={"ID":"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee","Type":"ContainerStarted","Data":"0a7aa55e4e4b93c2e1c09c6c838e8219ba9811d80e14382c66cb855090483f24"} Apr 21 04:36:02.771587 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:02.771209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" event={"ID":"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee","Type":"ContainerStarted","Data":"7df3c2a5e835d28a513997801bff5eadf73222c495aca45d6e39fcef122877cb"} Apr 21 04:36:02.771587 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:02.771403 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:36:02.788560 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:02.788522 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" podStartSLOduration=6.788509275 podStartE2EDuration="6.788509275s" podCreationTimestamp="2026-04-21 04:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:36:02.787693581 +0000 UTC m=+2348.452283667" watchObservedRunningTime="2026-04-21 04:36:02.788509275 +0000 UTC m=+2348.453099361" Apr 21 04:36:03.774308 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:03.774277 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:36:09.782090 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:09.782058 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:36:39.783386 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:39.783349 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 04:36:49.783382 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:36:49.783340 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 04:36:59.783410 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:36:59.783371 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 04:37:09.783131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:09.783089 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 04:37:19.786562 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:19.786482 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:37:26.600498 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:26.600468 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28"] Apr 21 04:37:26.600992 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:26.600773 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" containerID="cri-o://7df3c2a5e835d28a513997801bff5eadf73222c495aca45d6e39fcef122877cb" gracePeriod=30 Apr 21 04:37:26.600992 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:26.600793 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" 
podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kube-rbac-proxy" containerID="cri-o://0a7aa55e4e4b93c2e1c09c6c838e8219ba9811d80e14382c66cb855090483f24" gracePeriod=30 Apr 21 04:37:27.008446 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:27.008409 2575 generic.go:358] "Generic (PLEG): container finished" podID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerID="0a7aa55e4e4b93c2e1c09c6c838e8219ba9811d80e14382c66cb855090483f24" exitCode=2 Apr 21 04:37:27.008609 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:27.008461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" event={"ID":"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee","Type":"ContainerDied","Data":"0a7aa55e4e4b93c2e1c09c6c838e8219ba9811d80e14382c66cb855090483f24"} Apr 21 04:37:28.790139 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790107 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn"] Apr 21 04:37:28.792361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790395 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="storage-initializer" Apr 21 04:37:28.792361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790406 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="storage-initializer" Apr 21 04:37:28.792361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790414 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kube-rbac-proxy" Apr 21 04:37:28.792361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790420 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kube-rbac-proxy" Apr 21 04:37:28.792361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790436 
2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" Apr 21 04:37:28.792361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790441 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" Apr 21 04:37:28.792361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790490 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kserve-container" Apr 21 04:37:28.792361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.790500 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d39ed7c8-b6d3-453a-8bfb-77825958a03d" containerName="kube-rbac-proxy" Apr 21 04:37:28.793414 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.793399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.796109 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.796086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 21 04:37:28.796233 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.796125 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 21 04:37:28.802549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.802529 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn"] Apr 21 04:37:28.846432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.846411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/950bb92b-7654-40d2-9fec-a09d5464bc13-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.846542 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.846441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zb8q\" (UniqueName: \"kubernetes.io/projected/950bb92b-7654-40d2-9fec-a09d5464bc13-kube-api-access-8zb8q\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.846542 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.846464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.846638 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.846551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/950bb92b-7654-40d2-9fec-a09d5464bc13-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.947548 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.947522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/950bb92b-7654-40d2-9fec-a09d5464bc13-isvc-sklearn-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.947667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.947582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/950bb92b-7654-40d2-9fec-a09d5464bc13-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.947667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.947615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zb8q\" (UniqueName: \"kubernetes.io/projected/950bb92b-7654-40d2-9fec-a09d5464bc13-kube-api-access-8zb8q\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.947667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.947650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.947880 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:37:28.947742 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-predictor-serving-cert: secret "isvc-sklearn-predictor-serving-cert" not found Apr 21 04:37:28.947880 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:37:28.947837 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls podName:950bb92b-7654-40d2-9fec-a09d5464bc13 nodeName:}" failed. 
No retries permitted until 2026-04-21 04:37:29.447815822 +0000 UTC m=+2435.112405893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls") pod "isvc-sklearn-predictor-d8dbfbbb9-gn6xn" (UID: "950bb92b-7654-40d2-9fec-a09d5464bc13") : secret "isvc-sklearn-predictor-serving-cert" not found Apr 21 04:37:28.948048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.948028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/950bb92b-7654-40d2-9fec-a09d5464bc13-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.948280 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.948263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/950bb92b-7654-40d2-9fec-a09d5464bc13-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:28.956470 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:28.956451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zb8q\" (UniqueName: \"kubernetes.io/projected/950bb92b-7654-40d2-9fec-a09d5464bc13-kube-api-access-8zb8q\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:29.450256 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:29.450220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:29.452565 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:29.452546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gn6xn\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:29.704505 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:29.704417 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:29.777989 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:29.777954 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: connect: connection refused" Apr 21 04:37:29.782969 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:29.782917 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 21 04:37:29.823930 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:29.823906 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn"] Apr 21 04:37:29.826072 ip-10-0-134-15 
kubenswrapper[2575]: W0421 04:37:29.826046 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod950bb92b_7654_40d2_9fec_a09d5464bc13.slice/crio-f32b26fe6412f6b9b38c74376052086a0d6c25eeced80b4df67cc2d4946d18fc WatchSource:0}: Error finding container f32b26fe6412f6b9b38c74376052086a0d6c25eeced80b4df67cc2d4946d18fc: Status 404 returned error can't find the container with id f32b26fe6412f6b9b38c74376052086a0d6c25eeced80b4df67cc2d4946d18fc Apr 21 04:37:29.827856 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:29.827839 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:37:30.018114 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:30.018032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" event={"ID":"950bb92b-7654-40d2-9fec-a09d5464bc13","Type":"ContainerStarted","Data":"159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4"} Apr 21 04:37:30.018114 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:30.018071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" event={"ID":"950bb92b-7654-40d2-9fec-a09d5464bc13","Type":"ContainerStarted","Data":"f32b26fe6412f6b9b38c74376052086a0d6c25eeced80b4df67cc2d4946d18fc"} Apr 21 04:37:31.022593 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.022562 2575 generic.go:358] "Generic (PLEG): container finished" podID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerID="7df3c2a5e835d28a513997801bff5eadf73222c495aca45d6e39fcef122877cb" exitCode=0 Apr 21 04:37:31.023041 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.022617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" 
event={"ID":"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee","Type":"ContainerDied","Data":"7df3c2a5e835d28a513997801bff5eadf73222c495aca45d6e39fcef122877cb"} Apr 21 04:37:31.439642 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.439623 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:37:31.464235 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.464210 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jggb\" (UniqueName: \"kubernetes.io/projected/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kube-api-access-2jggb\") pod \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " Apr 21 04:37:31.464340 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.464266 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kserve-provision-location\") pod \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " Apr 21 04:37:31.464340 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.464321 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls\") pod \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " Apr 21 04:37:31.464455 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.464353 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\" (UID: \"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee\") " Apr 21 
04:37:31.464648 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.464622 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" (UID: "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:37:31.464798 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.464689 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" (UID: "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:37:31.466335 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.466308 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kube-api-access-2jggb" (OuterVolumeSpecName: "kube-api-access-2jggb") pod "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" (UID: "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee"). InnerVolumeSpecName "kube-api-access-2jggb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:37:31.466420 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.466340 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" (UID: "b0d80896-6bd3-4068-95d1-7ec1bfcc1eee"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:37:31.565381 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.565325 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2jggb\" (UniqueName: \"kubernetes.io/projected/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kube-api-access-2jggb\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:37:31.565381 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.565349 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:37:31.565381 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.565359 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:37:31.565381 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:31.565370 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:37:32.026855 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:32.026823 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" event={"ID":"b0d80896-6bd3-4068-95d1-7ec1bfcc1eee","Type":"ContainerDied","Data":"72c76577c85010cfcf7de71fb4ebb2bee87645162c3bebef92c5c69838a09284"} Apr 21 04:37:32.027267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:32.026869 2575 scope.go:117] "RemoveContainer" containerID="0a7aa55e4e4b93c2e1c09c6c838e8219ba9811d80e14382c66cb855090483f24" Apr 21 04:37:32.027267 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:37:32.026885 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28" Apr 21 04:37:32.034958 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:32.034937 2575 scope.go:117] "RemoveContainer" containerID="7df3c2a5e835d28a513997801bff5eadf73222c495aca45d6e39fcef122877cb" Apr 21 04:37:32.041858 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:32.041840 2575 scope.go:117] "RemoveContainer" containerID="81d054289fc2e533bcd2ef86e6ee38e034255f4ce1ed640a3764348a8c0a1cad" Apr 21 04:37:32.048901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:32.048874 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28"] Apr 21 04:37:32.055300 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:32.055276 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-wtq28"] Apr 21 04:37:33.027525 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:33.027491 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" path="/var/lib/kubelet/pods/b0d80896-6bd3-4068-95d1-7ec1bfcc1eee/volumes" Apr 21 04:37:34.033910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:34.033818 2575 generic.go:358] "Generic (PLEG): container finished" podID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerID="159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4" exitCode=0 Apr 21 04:37:34.033910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:34.033894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" event={"ID":"950bb92b-7654-40d2-9fec-a09d5464bc13","Type":"ContainerDied","Data":"159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4"} Apr 21 04:37:35.038768 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:37:35.038720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" event={"ID":"950bb92b-7654-40d2-9fec-a09d5464bc13","Type":"ContainerStarted","Data":"042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230"} Apr 21 04:37:35.038768 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:35.038752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" event={"ID":"950bb92b-7654-40d2-9fec-a09d5464bc13","Type":"ContainerStarted","Data":"c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3"} Apr 21 04:37:35.039239 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:35.039034 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:35.039239 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:35.039162 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:35.040463 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:35.040434 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:37:35.057471 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:35.057431 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podStartSLOduration=7.0574213 podStartE2EDuration="7.0574213s" podCreationTimestamp="2026-04-21 04:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:37:35.055498335 +0000 UTC m=+2440.720088420" 
watchObservedRunningTime="2026-04-21 04:37:35.0574213 +0000 UTC m=+2440.722011385" Apr 21 04:37:36.042018 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:36.041975 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:37:41.046664 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:41.046636 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:37:41.047224 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:41.047197 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:37:51.047285 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:37:51.047243 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:38:01.047562 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:01.047524 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:38:11.047684 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:11.047648 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" 
podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:38:21.047491 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:21.047452 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:38:31.048048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:31.048007 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:38:41.048521 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:41.048493 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:38:48.918674 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.918581 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn"] Apr 21 04:38:48.919218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.918949 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" containerID="cri-o://c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3" gracePeriod=30 Apr 21 04:38:48.919218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.918969 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" 
containerName="kube-rbac-proxy" containerID="cri-o://042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230" gracePeriod=30 Apr 21 04:38:48.988912 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.988875 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf"] Apr 21 04:38:48.989223 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.989206 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kube-rbac-proxy" Apr 21 04:38:48.989312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.989226 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kube-rbac-proxy" Apr 21 04:38:48.989312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.989237 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" Apr 21 04:38:48.989312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.989245 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" Apr 21 04:38:48.989312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.989257 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="storage-initializer" Apr 21 04:38:48.989312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.989266 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="storage-initializer" Apr 21 04:38:48.989571 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.989351 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kserve-container" Apr 21 04:38:48.989571 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.989366 2575 
memory_manager.go:356] "RemoveStaleState removing state" podUID="b0d80896-6bd3-4068-95d1-7ec1bfcc1eee" containerName="kube-rbac-proxy" Apr 21 04:38:48.992536 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.992514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:48.995110 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.995074 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 21 04:38:48.995234 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:48.995132 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 21 04:38:49.001796 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.001753 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf"] Apr 21 04:38:49.054457 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.054431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc429\" (UniqueName: \"kubernetes.io/projected/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kube-api-access-xc429\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.054574 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.054463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.054636 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.054599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.054695 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.054634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.155672 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.155647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.155866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.155678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.155866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.155709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc429\" (UniqueName: \"kubernetes.io/projected/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kube-api-access-xc429\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.155866 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.155729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.156115 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.156096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.156392 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.156371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 
04:38:49.158216 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.158193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.164485 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.164465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc429\" (UniqueName: \"kubernetes.io/projected/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kube-api-access-xc429\") pod \"sklearn-v2-mlserver-predictor-65d8664766-hkmsf\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.226294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.226222 2575 generic.go:358] "Generic (PLEG): container finished" podID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerID="042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230" exitCode=2 Apr 21 04:38:49.226294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.226272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" event={"ID":"950bb92b-7654-40d2-9fec-a09d5464bc13","Type":"ContainerDied","Data":"042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230"} Apr 21 04:38:49.303930 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.303902 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:49.425334 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:49.425306 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf"] Apr 21 04:38:49.427710 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:38:49.427681 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9309b908_6b04_4a34_b6d8_3a1ceb15acc7.slice/crio-0c93fa0ee0c515502e8e2ec531595c2aef84b369a90b9f33c8d66fc88ae39bd2 WatchSource:0}: Error finding container 0c93fa0ee0c515502e8e2ec531595c2aef84b369a90b9f33c8d66fc88ae39bd2: Status 404 returned error can't find the container with id 0c93fa0ee0c515502e8e2ec531595c2aef84b369a90b9f33c8d66fc88ae39bd2 Apr 21 04:38:50.230569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:50.230521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" event={"ID":"9309b908-6b04-4a34-b6d8-3a1ceb15acc7","Type":"ContainerStarted","Data":"1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1"} Apr 21 04:38:50.230569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:50.230568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" event={"ID":"9309b908-6b04-4a34-b6d8-3a1ceb15acc7","Type":"ContainerStarted","Data":"0c93fa0ee0c515502e8e2ec531595c2aef84b369a90b9f33c8d66fc88ae39bd2"} Apr 21 04:38:51.042275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:51.042236 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused" Apr 21 
04:38:51.047617 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:51.047584 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 21 04:38:53.156544 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.156517 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:38:53.188311 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.188288 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zb8q\" (UniqueName: \"kubernetes.io/projected/950bb92b-7654-40d2-9fec-a09d5464bc13-kube-api-access-8zb8q\") pod \"950bb92b-7654-40d2-9fec-a09d5464bc13\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " Apr 21 04:38:53.188467 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.188342 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls\") pod \"950bb92b-7654-40d2-9fec-a09d5464bc13\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " Apr 21 04:38:53.188467 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.188385 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/950bb92b-7654-40d2-9fec-a09d5464bc13-kserve-provision-location\") pod \"950bb92b-7654-40d2-9fec-a09d5464bc13\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " Apr 21 04:38:53.188603 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.188464 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/950bb92b-7654-40d2-9fec-a09d5464bc13-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"950bb92b-7654-40d2-9fec-a09d5464bc13\" (UID: \"950bb92b-7654-40d2-9fec-a09d5464bc13\") " Apr 21 04:38:53.188798 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.188749 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/950bb92b-7654-40d2-9fec-a09d5464bc13-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "950bb92b-7654-40d2-9fec-a09d5464bc13" (UID: "950bb92b-7654-40d2-9fec-a09d5464bc13"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:38:53.188905 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.188814 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/950bb92b-7654-40d2-9fec-a09d5464bc13-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "950bb92b-7654-40d2-9fec-a09d5464bc13" (UID: "950bb92b-7654-40d2-9fec-a09d5464bc13"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:38:53.190380 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.190354 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/950bb92b-7654-40d2-9fec-a09d5464bc13-kube-api-access-8zb8q" (OuterVolumeSpecName: "kube-api-access-8zb8q") pod "950bb92b-7654-40d2-9fec-a09d5464bc13" (UID: "950bb92b-7654-40d2-9fec-a09d5464bc13"). InnerVolumeSpecName "kube-api-access-8zb8q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:38:53.190380 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.190370 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "950bb92b-7654-40d2-9fec-a09d5464bc13" (UID: "950bb92b-7654-40d2-9fec-a09d5464bc13"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:38:53.240830 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.240800 2575 generic.go:358] "Generic (PLEG): container finished" podID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerID="c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3" exitCode=0 Apr 21 04:38:53.240961 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.240841 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" event={"ID":"950bb92b-7654-40d2-9fec-a09d5464bc13","Type":"ContainerDied","Data":"c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3"} Apr 21 04:38:53.240961 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.240864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" event={"ID":"950bb92b-7654-40d2-9fec-a09d5464bc13","Type":"ContainerDied","Data":"f32b26fe6412f6b9b38c74376052086a0d6c25eeced80b4df67cc2d4946d18fc"} Apr 21 04:38:53.240961 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.240883 2575 scope.go:117] "RemoveContainer" containerID="042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230" Apr 21 04:38:53.240961 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.240889 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn" Apr 21 04:38:53.251915 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.251866 2575 scope.go:117] "RemoveContainer" containerID="c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3" Apr 21 04:38:53.259160 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.259140 2575 scope.go:117] "RemoveContainer" containerID="159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4" Apr 21 04:38:53.264585 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.264562 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn"] Apr 21 04:38:53.266294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.266280 2575 scope.go:117] "RemoveContainer" containerID="042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230" Apr 21 04:38:53.266535 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:38:53.266518 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230\": container with ID starting with 042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230 not found: ID does not exist" containerID="042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230" Apr 21 04:38:53.266606 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.266545 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230"} err="failed to get container status \"042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230\": rpc error: code = NotFound desc = could not find container \"042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230\": container with ID starting with 042b2f87aa871097216eb30026a0ddaca48d51943257e8ce24cd33c5e4fda230 not found: ID does not exist" Apr 21 04:38:53.266606 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.266569 2575 scope.go:117] "RemoveContainer" containerID="c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3" Apr 21 04:38:53.266814 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:38:53.266796 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3\": container with ID starting with c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3 not found: ID does not exist" containerID="c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3" Apr 21 04:38:53.266867 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.266820 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3"} err="failed to get container status \"c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3\": rpc error: code = NotFound desc = could not find container \"c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3\": container with ID starting with c8cbe795878217c6025e0d6e13e9a093ddf4e3621d494a6ecbb38283129628b3 not found: ID does not exist" Apr 21 04:38:53.266867 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.266836 2575 scope.go:117] "RemoveContainer" containerID="159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4" Apr 21 04:38:53.267072 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:38:53.267048 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4\": container with ID starting with 159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4 not found: ID does not exist" containerID="159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4" Apr 21 04:38:53.267112 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:38:53.267082 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4"} err="failed to get container status \"159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4\": rpc error: code = NotFound desc = could not find container \"159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4\": container with ID starting with 159ae964d659499b127609c9fae12096e6f772fca1f648e95ebd903156d6dbb4 not found: ID does not exist" Apr 21 04:38:53.270219 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.270196 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gn6xn"] Apr 21 04:38:53.289549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.289531 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/950bb92b-7654-40d2-9fec-a09d5464bc13-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:38:53.289549 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.289550 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/950bb92b-7654-40d2-9fec-a09d5464bc13-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:38:53.289665 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.289562 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/950bb92b-7654-40d2-9fec-a09d5464bc13-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:38:53.289665 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:53.289572 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zb8q\" (UniqueName: 
\"kubernetes.io/projected/950bb92b-7654-40d2-9fec-a09d5464bc13-kube-api-access-8zb8q\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:38:54.244639 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:54.244557 2575 generic.go:358] "Generic (PLEG): container finished" podID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerID="1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1" exitCode=0 Apr 21 04:38:54.244639 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:54.244631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" event={"ID":"9309b908-6b04-4a34-b6d8-3a1ceb15acc7","Type":"ContainerDied","Data":"1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1"} Apr 21 04:38:55.027506 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:55.027470 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" path="/var/lib/kubelet/pods/950bb92b-7654-40d2-9fec-a09d5464bc13/volumes" Apr 21 04:38:55.250000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:55.249965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" event={"ID":"9309b908-6b04-4a34-b6d8-3a1ceb15acc7","Type":"ContainerStarted","Data":"5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51"} Apr 21 04:38:55.250000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:55.250002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" event={"ID":"9309b908-6b04-4a34-b6d8-3a1ceb15acc7","Type":"ContainerStarted","Data":"c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62"} Apr 21 04:38:55.250388 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:55.250232 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 
21 04:38:55.250388 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:55.250284 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:38:55.269446 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:38:55.269401 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" podStartSLOduration=7.269389549 podStartE2EDuration="7.269389549s" podCreationTimestamp="2026-04-21 04:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:38:55.268150683 +0000 UTC m=+2520.932740768" watchObservedRunningTime="2026-04-21 04:38:55.269389549 +0000 UTC m=+2520.933979636" Apr 21 04:39:01.258776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:01.258726 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:39:31.338706 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:31.338665 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 21 04:39:41.261436 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:41.261404 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:39:49.068550 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.068515 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf"] Apr 21 04:39:49.068959 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.068834 2575 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kserve-container" containerID="cri-o://c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62" gracePeriod=30 Apr 21 04:39:49.068959 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.068889 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kube-rbac-proxy" containerID="cri-o://5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51" gracePeriod=30 Apr 21 04:39:49.151112 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151083 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs"] Apr 21 04:39:49.151373 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151361 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kube-rbac-proxy" Apr 21 04:39:49.151418 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151374 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kube-rbac-proxy" Apr 21 04:39:49.151418 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151385 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" Apr 21 04:39:49.151418 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151390 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" Apr 21 04:39:49.151418 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151406 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" 
containerName="storage-initializer" Apr 21 04:39:49.151418 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151412 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="storage-initializer" Apr 21 04:39:49.151579 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151451 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kserve-container" Apr 21 04:39:49.151579 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.151459 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="950bb92b-7654-40d2-9fec-a09d5464bc13" containerName="kube-rbac-proxy" Apr 21 04:39:49.154577 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.154555 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.157037 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.157017 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 21 04:39:49.157150 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.157028 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 21 04:39:49.162296 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.162274 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs"] Apr 21 04:39:49.270212 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.270169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4hqb\" (UniqueName: \"kubernetes.io/projected/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kube-api-access-p4hqb\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: 
\"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.270368 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.270222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.270368 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.270302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.270368 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.270363 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.371551 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.371522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: 
\"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.371720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.371566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.371720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.371609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4hqb\" (UniqueName: \"kubernetes.io/projected/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kube-api-access-p4hqb\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.371720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.371626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.372003 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.371976 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.372339 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.372318 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.374126 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.374109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.379082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.379064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4hqb\" (UniqueName: \"kubernetes.io/projected/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kube-api-access-p4hqb\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-b8hxs\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.390858 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.390829 2575 generic.go:358] "Generic (PLEG): container finished" podID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerID="5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51" exitCode=2 Apr 21 04:39:49.390985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.390901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" 
event={"ID":"9309b908-6b04-4a34-b6d8-3a1ceb15acc7","Type":"ContainerDied","Data":"5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51"} Apr 21 04:39:49.465669 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.465641 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:49.581799 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:49.581750 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs"] Apr 21 04:39:49.584957 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:39:49.584926 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0884b0f_74d7_4eb8_a7c5_8d928b65aff8.slice/crio-53661a8ed9dcb7230d15981987f042e81617a9d093be8b772e3b58a9fd086d2a WatchSource:0}: Error finding container 53661a8ed9dcb7230d15981987f042e81617a9d093be8b772e3b58a9fd086d2a: Status 404 returned error can't find the container with id 53661a8ed9dcb7230d15981987f042e81617a9d093be8b772e3b58a9fd086d2a Apr 21 04:39:50.394910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:50.394868 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" event={"ID":"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8","Type":"ContainerStarted","Data":"baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb"} Apr 21 04:39:50.394910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:50.394906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" event={"ID":"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8","Type":"ContainerStarted","Data":"53661a8ed9dcb7230d15981987f042e81617a9d093be8b772e3b58a9fd086d2a"} Apr 21 04:39:51.253012 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:51.252969 2575 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.47:8643/healthz\": dial tcp 10.132.0.47:8643: connect: connection refused" Apr 21 04:39:52.301904 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:52.301864 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.47:8080/v2/models/sklearn-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 21 04:39:55.408831 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:55.408798 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerID="baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb" exitCode=0 Apr 21 04:39:55.409242 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:55.408881 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" event={"ID":"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8","Type":"ContainerDied","Data":"baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb"} Apr 21 04:39:56.202271 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.202250 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:39:56.324910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.324875 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " Apr 21 04:39:56.324910 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.324911 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-proxy-tls\") pod \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " Apr 21 04:39:56.325162 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.324935 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kserve-provision-location\") pod \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " Apr 21 04:39:56.325162 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.325024 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc429\" (UniqueName: \"kubernetes.io/projected/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kube-api-access-xc429\") pod \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\" (UID: \"9309b908-6b04-4a34-b6d8-3a1ceb15acc7\") " Apr 21 04:39:56.325280 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.325200 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "9309b908-6b04-4a34-b6d8-3a1ceb15acc7" (UID: "9309b908-6b04-4a34-b6d8-3a1ceb15acc7"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:39:56.325280 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.325235 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9309b908-6b04-4a34-b6d8-3a1ceb15acc7" (UID: "9309b908-6b04-4a34-b6d8-3a1ceb15acc7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:39:56.327027 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.327001 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9309b908-6b04-4a34-b6d8-3a1ceb15acc7" (UID: "9309b908-6b04-4a34-b6d8-3a1ceb15acc7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:39:56.327027 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.327011 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kube-api-access-xc429" (OuterVolumeSpecName: "kube-api-access-xc429") pod "9309b908-6b04-4a34-b6d8-3a1ceb15acc7" (UID: "9309b908-6b04-4a34-b6d8-3a1ceb15acc7"). InnerVolumeSpecName "kube-api-access-xc429". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:39:56.413554 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.413528 2575 generic.go:358] "Generic (PLEG): container finished" podID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerID="c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62" exitCode=0 Apr 21 04:39:56.414006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.413591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" event={"ID":"9309b908-6b04-4a34-b6d8-3a1ceb15acc7","Type":"ContainerDied","Data":"c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62"} Apr 21 04:39:56.414006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.413619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" event={"ID":"9309b908-6b04-4a34-b6d8-3a1ceb15acc7","Type":"ContainerDied","Data":"0c93fa0ee0c515502e8e2ec531595c2aef84b369a90b9f33c8d66fc88ae39bd2"} Apr 21 04:39:56.414006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.413622 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf" Apr 21 04:39:56.414006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.413639 2575 scope.go:117] "RemoveContainer" containerID="5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51" Apr 21 04:39:56.415729 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.415702 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" event={"ID":"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8","Type":"ContainerStarted","Data":"71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2"} Apr 21 04:39:56.415852 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.415746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" event={"ID":"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8","Type":"ContainerStarted","Data":"25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152"} Apr 21 04:39:56.416028 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.416002 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:56.416119 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.416038 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:39:56.417523 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.417492 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 21 04:39:56.422976 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.422958 2575 scope.go:117] "RemoveContainer" 
containerID="c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62" Apr 21 04:39:56.425458 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.425439 2575 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:39:56.425545 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.425460 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:39:56.425545 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.425471 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:39:56.425545 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.425480 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xc429\" (UniqueName: \"kubernetes.io/projected/9309b908-6b04-4a34-b6d8-3a1ceb15acc7-kube-api-access-xc429\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:39:56.430125 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.430107 2575 scope.go:117] "RemoveContainer" containerID="1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1" Apr 21 04:39:56.436948 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.436924 2575 scope.go:117] "RemoveContainer" containerID="5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51" Apr 21 04:39:56.437240 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:39:56.437212 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51\": container with ID starting with 5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51 not found: ID does not exist" containerID="5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51" Apr 21 04:39:56.437327 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.437241 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51"} err="failed to get container status \"5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51\": rpc error: code = NotFound desc = could not find container \"5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51\": container with ID starting with 5afcc835d300d6cc8e7d74d3f16365632109710d2251e57ea61835a38e269c51 not found: ID does not exist" Apr 21 04:39:56.437327 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.437263 2575 scope.go:117] "RemoveContainer" containerID="c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62" Apr 21 04:39:56.437456 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.437376 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" podStartSLOduration=7.437360604 podStartE2EDuration="7.437360604s" podCreationTimestamp="2026-04-21 04:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:39:56.434795273 +0000 UTC m=+2582.099385362" watchObservedRunningTime="2026-04-21 04:39:56.437360604 +0000 UTC m=+2582.101950692" Apr 21 04:39:56.437538 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:39:56.437521 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62\": container 
with ID starting with c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62 not found: ID does not exist" containerID="c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62" Apr 21 04:39:56.437596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.437541 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62"} err="failed to get container status \"c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62\": rpc error: code = NotFound desc = could not find container \"c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62\": container with ID starting with c70f3d7cc804b68038b1ab0e300b853f63d631820b1776cd89010b8cebe83e62 not found: ID does not exist" Apr 21 04:39:56.437596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.437556 2575 scope.go:117] "RemoveContainer" containerID="1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1" Apr 21 04:39:56.437938 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:39:56.437902 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1\": container with ID starting with 1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1 not found: ID does not exist" containerID="1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1" Apr 21 04:39:56.438026 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.437945 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1"} err="failed to get container status \"1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1\": rpc error: code = NotFound desc = could not find container \"1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1\": container with ID starting 
with 1eb3898a4b766b2070f06ce8eebf285e82395164aebd676f2bbdd2ad178c43c1 not found: ID does not exist" Apr 21 04:39:56.447198 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.447177 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf"] Apr 21 04:39:56.452338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:56.452320 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-hkmsf"] Apr 21 04:39:57.027133 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:57.027094 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" path="/var/lib/kubelet/pods/9309b908-6b04-4a34-b6d8-3a1ceb15acc7/volumes" Apr 21 04:39:57.419460 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:39:57.419426 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 21 04:40:02.423655 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:02.423627 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:40:02.424096 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:02.424072 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 21 04:40:12.424465 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:12.424437 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 
21 04:40:26.188132 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.188028 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-65cd49579f-b8hxs_c0884b0f-74d7-4eb8-a7c5-8d928b65aff8/kserve-container/0.log" Apr 21 04:40:26.334893 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.334859 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs"] Apr 21 04:40:26.335193 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.335167 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kserve-container" containerID="cri-o://25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152" gracePeriod=30 Apr 21 04:40:26.335269 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.335214 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kube-rbac-proxy" containerID="cri-o://71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2" gracePeriod=30 Apr 21 04:40:26.420154 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420121 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh"] Apr 21 04:40:26.420390 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420378 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="storage-initializer" Apr 21 04:40:26.420440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420391 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="storage-initializer" Apr 21 04:40:26.420440 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420401 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kube-rbac-proxy" Apr 21 04:40:26.420440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420407 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kube-rbac-proxy" Apr 21 04:40:26.420440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420427 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kserve-container" Apr 21 04:40:26.420440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420433 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kserve-container" Apr 21 04:40:26.420624 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420477 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kserve-container" Apr 21 04:40:26.420624 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.420486 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9309b908-6b04-4a34-b6d8-3a1ceb15acc7" containerName="kube-rbac-proxy" Apr 21 04:40:26.423596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.423578 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.426154 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.426131 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\"" Apr 21 04:40:26.426273 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.426206 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 21 04:40:26.433003 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.432981 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh"] Apr 21 04:40:26.501571 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.501544 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerID="71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2" exitCode=2 Apr 21 04:40:26.501674 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.501614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" event={"ID":"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8","Type":"ContainerDied","Data":"71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2"} Apr 21 04:40:26.550608 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.550586 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zlq9\" (UniqueName: \"kubernetes.io/projected/cad4f829-f300-4b66-bd57-a61daa820dac-kube-api-access-8zlq9\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.550711 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.550635 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cad4f829-f300-4b66-bd57-a61daa820dac-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.550711 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.550692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cad4f829-f300-4b66-bd57-a61daa820dac-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.550817 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.550775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad4f829-f300-4b66-bd57-a61daa820dac-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.651523 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.651501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cad4f829-f300-4b66-bd57-a61daa820dac-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.651638 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.651554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad4f829-f300-4b66-bd57-a61daa820dac-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.651638 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.651602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zlq9\" (UniqueName: \"kubernetes.io/projected/cad4f829-f300-4b66-bd57-a61daa820dac-kube-api-access-8zlq9\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.651752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.651708 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cad4f829-f300-4b66-bd57-a61daa820dac-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.652012 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.651991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad4f829-f300-4b66-bd57-a61daa820dac-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.652219 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.652202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cad4f829-f300-4b66-bd57-a61daa820dac-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.653947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.653925 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cad4f829-f300-4b66-bd57-a61daa820dac-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.660350 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.660329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zlq9\" (UniqueName: \"kubernetes.io/projected/cad4f829-f300-4b66-bd57-a61daa820dac-kube-api-access-8zlq9\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.734185 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.734134 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:26.854062 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:26.854015 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh"] Apr 21 04:40:26.856301 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:40:26.856278 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad4f829_f300_4b66_bd57_a61daa820dac.slice/crio-d41be470f28a33be9fe6d66c85006d9dfc443bd8d4c5f73dda461911496d4457 WatchSource:0}: Error finding container d41be470f28a33be9fe6d66c85006d9dfc443bd8d4c5f73dda461911496d4457: Status 404 returned error can't find the container with id d41be470f28a33be9fe6d66c85006d9dfc443bd8d4c5f73dda461911496d4457 Apr 21 04:40:27.481113 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.481085 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:40:27.507034 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.506993 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerID="25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152" exitCode=0 Apr 21 04:40:27.507185 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.507077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" event={"ID":"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8","Type":"ContainerDied","Data":"25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152"} Apr 21 04:40:27.507185 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.507136 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" event={"ID":"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8","Type":"ContainerDied","Data":"53661a8ed9dcb7230d15981987f042e81617a9d093be8b772e3b58a9fd086d2a"} Apr 21 04:40:27.507185 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.507087 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" Apr 21 04:40:27.507185 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.507161 2575 scope.go:117] "RemoveContainer" containerID="71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2" Apr 21 04:40:27.508544 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.508518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" event={"ID":"cad4f829-f300-4b66-bd57-a61daa820dac","Type":"ContainerStarted","Data":"e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36"} Apr 21 04:40:27.508645 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.508553 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" event={"ID":"cad4f829-f300-4b66-bd57-a61daa820dac","Type":"ContainerStarted","Data":"d41be470f28a33be9fe6d66c85006d9dfc443bd8d4c5f73dda461911496d4457"} Apr 21 04:40:27.514854 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.514836 2575 scope.go:117] "RemoveContainer" containerID="25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152" Apr 21 04:40:27.521895 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.521876 2575 scope.go:117] "RemoveContainer" containerID="baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb" Apr 21 04:40:27.528429 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.528405 2575 scope.go:117] "RemoveContainer" containerID="71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2" Apr 21 04:40:27.529384 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:40:27.529335 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2\": container with ID starting with 71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2 not found: 
ID does not exist" containerID="71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2" Apr 21 04:40:27.529517 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.529389 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2"} err="failed to get container status \"71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2\": rpc error: code = NotFound desc = could not find container \"71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2\": container with ID starting with 71b120772c6f0f1ad1e0fdfc50715a61142cfa99ec1380b75fc8016bafe8eca2 not found: ID does not exist" Apr 21 04:40:27.529517 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.529414 2575 scope.go:117] "RemoveContainer" containerID="25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152" Apr 21 04:40:27.529709 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:40:27.529690 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152\": container with ID starting with 25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152 not found: ID does not exist" containerID="25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152" Apr 21 04:40:27.529782 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.529715 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152"} err="failed to get container status \"25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152\": rpc error: code = NotFound desc = could not find container \"25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152\": container with ID starting with 25326614dc63ef684e5e84862d41d34be5c451b44c144713e7cb884323653152 not found: ID does not exist" 
Apr 21 04:40:27.529782 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.529732 2575 scope.go:117] "RemoveContainer" containerID="baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb" Apr 21 04:40:27.530030 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:40:27.529998 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb\": container with ID starting with baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb not found: ID does not exist" containerID="baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb" Apr 21 04:40:27.530087 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.530038 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb"} err="failed to get container status \"baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb\": rpc error: code = NotFound desc = could not find container \"baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb\": container with ID starting with baaf6cfa0138b6c9271ffeb936bda28fed48f275728fa2b8a6b5c230695a14fb not found: ID does not exist" Apr 21 04:40:27.559295 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.559271 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4hqb\" (UniqueName: \"kubernetes.io/projected/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kube-api-access-p4hqb\") pod \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " Apr 21 04:40:27.559378 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.559299 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-proxy-tls\") pod \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\" (UID: 
\"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " Apr 21 04:40:27.559378 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.559351 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kserve-provision-location\") pod \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " Apr 21 04:40:27.559456 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.559383 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\" (UID: \"c0884b0f-74d7-4eb8-a7c5-8d928b65aff8\") " Apr 21 04:40:27.559736 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.559710 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" (UID: "c0884b0f-74d7-4eb8-a7c5-8d928b65aff8"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:40:27.561195 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.561172 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kube-api-access-p4hqb" (OuterVolumeSpecName: "kube-api-access-p4hqb") pod "c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" (UID: "c0884b0f-74d7-4eb8-a7c5-8d928b65aff8"). InnerVolumeSpecName "kube-api-access-p4hqb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:40:27.561601 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.561584 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" (UID: "c0884b0f-74d7-4eb8-a7c5-8d928b65aff8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:40:27.585022 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.584992 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" (UID: "c0884b0f-74d7-4eb8-a7c5-8d928b65aff8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:40:27.660794 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.660748 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:40:27.660879 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.660796 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4hqb\" (UniqueName: \"kubernetes.io/projected/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kube-api-access-p4hqb\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:40:27.660879 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.660810 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 
04:40:27.660879 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.660820 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:40:27.829370 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.829349 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs"] Apr 21 04:40:27.832733 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:27.832712 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs"] Apr 21 04:40:28.420912 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:28.420872 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-b8hxs" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.48:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 21 04:40:29.027419 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:29.027384 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" path="/var/lib/kubelet/pods/c0884b0f-74d7-4eb8-a7c5-8d928b65aff8/volumes" Apr 21 04:40:31.521189 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:31.521155 2575 generic.go:358] "Generic (PLEG): container finished" podID="cad4f829-f300-4b66-bd57-a61daa820dac" containerID="e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36" exitCode=0 Apr 21 04:40:31.521658 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:31.521231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" 
event={"ID":"cad4f829-f300-4b66-bd57-a61daa820dac","Type":"ContainerDied","Data":"e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36"} Apr 21 04:40:32.525581 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:32.525548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" event={"ID":"cad4f829-f300-4b66-bd57-a61daa820dac","Type":"ContainerStarted","Data":"7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74"} Apr 21 04:40:32.525581 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:32.525586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" event={"ID":"cad4f829-f300-4b66-bd57-a61daa820dac","Type":"ContainerStarted","Data":"a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c"} Apr 21 04:40:32.526030 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:32.525804 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:32.526030 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:32.525861 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:40:32.544633 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:32.544582 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" podStartSLOduration=6.544564284 podStartE2EDuration="6.544564284s" podCreationTimestamp="2026-04-21 04:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:40:32.543048955 +0000 UTC m=+2618.207639043" watchObservedRunningTime="2026-04-21 04:40:32.544564284 +0000 UTC m=+2618.209154371" Apr 21 04:40:38.534633 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:40:38.534603 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:41:08.538753 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:08.538690 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 21 04:41:18.537407 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:18.537378 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" Apr 21 04:41:26.528221 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.528192 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh"] Apr 21 04:41:26.528625 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.528509 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kserve-container" containerID="cri-o://a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c" gracePeriod=30 Apr 21 04:41:26.528625 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.528562 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kube-rbac-proxy" containerID="cri-o://7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74" gracePeriod=30 Apr 21 04:41:26.603228 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603198 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"] Apr 21 04:41:26.603487 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603474 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kserve-container" Apr 21 04:41:26.603530 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603489 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kserve-container" Apr 21 04:41:26.603530 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603508 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="storage-initializer" Apr 21 04:41:26.603530 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603515 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="storage-initializer" Apr 21 04:41:26.603530 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603523 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kube-rbac-proxy" Apr 21 04:41:26.603530 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603529 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kube-rbac-proxy" Apr 21 04:41:26.603691 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603577 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kserve-container" Apr 21 04:41:26.603691 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.603585 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0884b0f-74d7-4eb8-a7c5-8d928b65aff8" containerName="kube-rbac-proxy" Apr 21 04:41:26.606656 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.606638 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.609566 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.609542 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 21 04:41:26.609904 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.609888 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 21 04:41:26.619053 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.619034 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"] Apr 21 04:41:26.672242 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.672218 2575 generic.go:358] "Generic (PLEG): container finished" podID="cad4f829-f300-4b66-bd57-a61daa820dac" containerID="7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74" exitCode=2 Apr 21 04:41:26.672351 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.672252 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" event={"ID":"cad4f829-f300-4b66-bd57-a61daa820dac","Type":"ContainerDied","Data":"7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74"} Apr 21 04:41:26.776019 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.775995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.776124 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.776032 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.776124 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.776055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.776202 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.776144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2pxg\" (UniqueName: \"kubernetes.io/projected/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kube-api-access-w2pxg\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.877313 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.877292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.877409 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.877328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.877409 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.877353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.877409 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.877373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2pxg\" (UniqueName: \"kubernetes.io/projected/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kube-api-access-w2pxg\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" Apr 21 04:41:26.877563 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:41:26.877457 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 21 04:41:26.877563 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:41:26.877539 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls podName:5b878c0e-21b9-45ef-8470-6867a0fc6bdc nodeName:}" failed. No retries permitted until 2026-04-21 04:41:27.377522762 +0000 UTC m=+2673.042112830 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls") pod "isvc-sklearn-v2-predictor-69755fbb9-8jdwn" (UID: "5b878c0e-21b9-45ef-8470-6867a0fc6bdc") : secret "isvc-sklearn-v2-predictor-serving-cert" not found
Apr 21 04:41:26.877815 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.877789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:26.877977 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.877960 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:26.887959 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:26.887942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2pxg\" (UniqueName: \"kubernetes.io/projected/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kube-api-access-w2pxg\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:27.381522 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:27.381493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:27.383919 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:27.383900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-8jdwn\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:27.516117 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:27.516085 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:27.632861 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:27.632789 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"]
Apr 21 04:41:27.634956 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:41:27.634928 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b878c0e_21b9_45ef_8470_6867a0fc6bdc.slice/crio-0072eea2c0a429e9488c2a6b0e62794788bfbdabb3be02213b8fbaafe31ff8cf WatchSource:0}: Error finding container 0072eea2c0a429e9488c2a6b0e62794788bfbdabb3be02213b8fbaafe31ff8cf: Status 404 returned error can't find the container with id 0072eea2c0a429e9488c2a6b0e62794788bfbdabb3be02213b8fbaafe31ff8cf
Apr 21 04:41:27.675325 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:27.675293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" event={"ID":"5b878c0e-21b9-45ef-8470-6867a0fc6bdc","Type":"ContainerStarted","Data":"0072eea2c0a429e9488c2a6b0e62794788bfbdabb3be02213b8fbaafe31ff8cf"}
Apr 21 04:41:28.530342 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:28.530296 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.49:8643/healthz\": dial tcp 10.132.0.49:8643: connect: connection refused"
Apr 21 04:41:28.679538 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:28.679503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" event={"ID":"5b878c0e-21b9-45ef-8470-6867a0fc6bdc","Type":"ContainerStarted","Data":"a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a"}
Apr 21 04:41:29.545586 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:29.545530 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.49:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 21 04:41:31.688949 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:31.688916 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerID="a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a" exitCode=0
Apr 21 04:41:31.689352 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:31.688976 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" event={"ID":"5b878c0e-21b9-45ef-8470-6867a0fc6bdc","Type":"ContainerDied","Data":"a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a"}
Apr 21 04:41:32.694349 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:32.694314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" event={"ID":"5b878c0e-21b9-45ef-8470-6867a0fc6bdc","Type":"ContainerStarted","Data":"a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718"}
Apr 21 04:41:32.694349 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:32.694356 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" event={"ID":"5b878c0e-21b9-45ef-8470-6867a0fc6bdc","Type":"ContainerStarted","Data":"ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62"}
Apr 21 04:41:32.694795 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:32.694687 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:32.694795 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:32.694787 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:32.696069 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:32.696044 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:41:32.712183 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:32.712144 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podStartSLOduration=6.71213099 podStartE2EDuration="6.71213099s" podCreationTimestamp="2026-04-21 04:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:41:32.710661984 +0000 UTC m=+2678.375252081" watchObservedRunningTime="2026-04-21 04:41:32.71213099 +0000 UTC m=+2678.376721070"
Apr 21 04:41:32.969412 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:32.969392 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh"
Apr 21 04:41:33.123996 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.123967 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cad4f829-f300-4b66-bd57-a61daa820dac-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"cad4f829-f300-4b66-bd57-a61daa820dac\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") "
Apr 21 04:41:33.124153 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.124013 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cad4f829-f300-4b66-bd57-a61daa820dac-proxy-tls\") pod \"cad4f829-f300-4b66-bd57-a61daa820dac\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") "
Apr 21 04:41:33.124153 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.124096 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad4f829-f300-4b66-bd57-a61daa820dac-kserve-provision-location\") pod \"cad4f829-f300-4b66-bd57-a61daa820dac\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") "
Apr 21 04:41:33.124153 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.124125 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zlq9\" (UniqueName: \"kubernetes.io/projected/cad4f829-f300-4b66-bd57-a61daa820dac-kube-api-access-8zlq9\") pod \"cad4f829-f300-4b66-bd57-a61daa820dac\" (UID: \"cad4f829-f300-4b66-bd57-a61daa820dac\") "
Apr 21 04:41:33.124346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.124325 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cad4f829-f300-4b66-bd57-a61daa820dac-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "cad4f829-f300-4b66-bd57-a61daa820dac" (UID: "cad4f829-f300-4b66-bd57-a61daa820dac"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:41:33.124488 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.124460 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad4f829-f300-4b66-bd57-a61daa820dac-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cad4f829-f300-4b66-bd57-a61daa820dac" (UID: "cad4f829-f300-4b66-bd57-a61daa820dac"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:41:33.126175 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.126155 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad4f829-f300-4b66-bd57-a61daa820dac-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cad4f829-f300-4b66-bd57-a61daa820dac" (UID: "cad4f829-f300-4b66-bd57-a61daa820dac"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:41:33.126256 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.126226 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad4f829-f300-4b66-bd57-a61daa820dac-kube-api-access-8zlq9" (OuterVolumeSpecName: "kube-api-access-8zlq9") pod "cad4f829-f300-4b66-bd57-a61daa820dac" (UID: "cad4f829-f300-4b66-bd57-a61daa820dac"). InnerVolumeSpecName "kube-api-access-8zlq9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:41:33.225310 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.225246 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad4f829-f300-4b66-bd57-a61daa820dac-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:41:33.225310 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.225270 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zlq9\" (UniqueName: \"kubernetes.io/projected/cad4f829-f300-4b66-bd57-a61daa820dac-kube-api-access-8zlq9\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:41:33.225310 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.225282 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cad4f829-f300-4b66-bd57-a61daa820dac-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:41:33.225310 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.225292 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cad4f829-f300-4b66-bd57-a61daa820dac-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:41:33.699083 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.699052 2575 generic.go:358] "Generic (PLEG): container finished" podID="cad4f829-f300-4b66-bd57-a61daa820dac" containerID="a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c" exitCode=0
Apr 21 04:41:33.699510 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.699129 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh"
Apr 21 04:41:33.699510 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.699139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" event={"ID":"cad4f829-f300-4b66-bd57-a61daa820dac","Type":"ContainerDied","Data":"a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c"}
Apr 21 04:41:33.699510 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.699175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh" event={"ID":"cad4f829-f300-4b66-bd57-a61daa820dac","Type":"ContainerDied","Data":"d41be470f28a33be9fe6d66c85006d9dfc443bd8d4c5f73dda461911496d4457"}
Apr 21 04:41:33.699510 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.699193 2575 scope.go:117] "RemoveContainer" containerID="7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74"
Apr 21 04:41:33.699746 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.699707 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:41:33.707512 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.707498 2575 scope.go:117] "RemoveContainer" containerID="a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c"
Apr 21 04:41:33.714563 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.714546 2575 scope.go:117] "RemoveContainer" containerID="e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36"
Apr 21 04:41:33.719502 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.719478 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh"]
Apr 21 04:41:33.721392 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.721371 2575 scope.go:117] "RemoveContainer" containerID="7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74"
Apr 21 04:41:33.721734 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:41:33.721702 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74\": container with ID starting with 7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74 not found: ID does not exist" containerID="7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74"
Apr 21 04:41:33.721868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.721742 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74"} err="failed to get container status \"7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74\": rpc error: code = NotFound desc = could not find container \"7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74\": container with ID starting with 7ce2908d3acf4e33d012394bf52c97cc546993c1e9e3f9742f8f6c782d325b74 not found: ID does not exist"
Apr 21 04:41:33.721868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.721785 2575 scope.go:117] "RemoveContainer" containerID="a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c"
Apr 21 04:41:33.722269 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:41:33.722239 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c\": container with ID starting with a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c not found: ID does not exist" containerID="a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c"
Apr 21 04:41:33.722375 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.722278 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c"} err="failed to get container status \"a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c\": rpc error: code = NotFound desc = could not find container \"a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c\": container with ID starting with a08b2923b4232a87647ea29a9e8414fabed3c26fed39e4bb718d0a5ac050627c not found: ID does not exist"
Apr 21 04:41:33.722375 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.722303 2575 scope.go:117] "RemoveContainer" containerID="e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36"
Apr 21 04:41:33.722572 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:41:33.722552 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36\": container with ID starting with e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36 not found: ID does not exist" containerID="e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36"
Apr 21 04:41:33.722632 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.722578 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36"} err="failed to get container status \"e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36\": rpc error: code = NotFound desc = could not find container \"e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36\": container with ID starting with e65d7ef4d8ad79a95a2ad5b92af28d078a4a4e082aa29d1ff6dbcf9e99b75a36 not found: ID does not exist"
Apr 21 04:41:33.723834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:33.723816 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2jxvh"]
Apr 21 04:41:35.027228 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:35.027197 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" path="/var/lib/kubelet/pods/cad4f829-f300-4b66-bd57-a61daa820dac/volumes"
Apr 21 04:41:38.703664 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:38.703637 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:41:38.704154 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:38.704127 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:41:48.704861 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:48.704745 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:41:58.704795 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:41:58.704739 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:42:08.704301 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:08.704262 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:42:18.705118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:18.705079 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:42:28.704402 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:28.704366 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:42:38.704843 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:38.704814 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:42:46.815401 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.815364 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"]
Apr 21 04:42:46.815973 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.815793 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" containerID="cri-o://ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62" gracePeriod=30
Apr 21 04:42:46.815973 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.815872 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kube-rbac-proxy" containerID="cri-o://a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718" gracePeriod=30
Apr 21 04:42:46.917628 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.917597 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"]
Apr 21 04:42:46.917931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.917917 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kube-rbac-proxy"
Apr 21 04:42:46.917985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.917934 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kube-rbac-proxy"
Apr 21 04:42:46.917985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.917948 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="storage-initializer"
Apr 21 04:42:46.917985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.917954 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="storage-initializer"
Apr 21 04:42:46.917985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.917963 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kserve-container"
Apr 21 04:42:46.917985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.917968 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kserve-container"
Apr 21 04:42:46.918141 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.918018 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kserve-container"
Apr 21 04:42:46.918141 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.918026 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cad4f829-f300-4b66-bd57-a61daa820dac" containerName="kube-rbac-proxy"
Apr 21 04:42:46.921155 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.921136 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:46.923966 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.923944 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\""
Apr 21 04:42:46.924219 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.924200 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\""
Apr 21 04:42:46.931318 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:46.931298 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"]
Apr 21 04:42:47.006397 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.006367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.006519 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.006420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/663fcda7-68d5-4f11-aabe-091e83cd9f7b-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.006519 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.006446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.006519 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.006482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2qb\" (UniqueName: \"kubernetes.io/projected/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kube-api-access-4d2qb\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.107432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.107372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.107432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.107409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/663fcda7-68d5-4f11-aabe-091e83cd9f7b-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.107432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.107428 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.107639 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.107451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2qb\" (UniqueName: \"kubernetes.io/projected/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kube-api-access-4d2qb\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.107639 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:42:47.107559 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-serving-cert: secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found
Apr 21 04:42:47.107639 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:42:47.107629 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls podName:663fcda7-68d5-4f11-aabe-091e83cd9f7b nodeName:}" failed. No retries permitted until 2026-04-21 04:42:47.607610394 +0000 UTC m=+2753.272200480 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls") pod "isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" (UID: "663fcda7-68d5-4f11-aabe-091e83cd9f7b") : secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found
Apr 21 04:42:47.107852 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.107833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.108081 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.108064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/663fcda7-68d5-4f11-aabe-091e83cd9f7b-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.119779 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.119740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2qb\" (UniqueName: \"kubernetes.io/projected/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kube-api-access-4d2qb\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.610983 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.610935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.613285 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.613255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.831289 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.831259 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"
Apr 21 04:42:47.901808 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.901780 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerID="a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718" exitCode=2
Apr 21 04:42:47.901949 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.901834 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" event={"ID":"5b878c0e-21b9-45ef-8470-6867a0fc6bdc","Type":"ContainerDied","Data":"a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718"}
Apr 21 04:42:47.952082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.952055 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"]
Apr 21 04:42:47.954890 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:42:47.954854 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663fcda7_68d5_4f11_aabe_091e83cd9f7b.slice/crio-ac996a319546c963b254317896ed06bec8b5b52edc4b659d5134c93d8fc6abb3 WatchSource:0}: Error finding container ac996a319546c963b254317896ed06bec8b5b52edc4b659d5134c93d8fc6abb3: Status 404 returned error can't find the container with id ac996a319546c963b254317896ed06bec8b5b52edc4b659d5134c93d8fc6abb3
Apr 21 04:42:47.956671 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:47.956654 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:42:48.700541 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:48.700503 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.50:8643/healthz\": dial tcp 10.132.0.50:8643: connect: connection refused"
Apr 21 04:42:48.704903 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:48.704876 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 21 04:42:48.905834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:48.905795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" event={"ID":"663fcda7-68d5-4f11-aabe-091e83cd9f7b","Type":"ContainerStarted","Data":"87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e"}
Apr 21 04:42:48.905834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:48.905832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" event={"ID":"663fcda7-68d5-4f11-aabe-091e83cd9f7b","Type":"ContainerStarted","Data":"ac996a319546c963b254317896ed06bec8b5b52edc4b659d5134c93d8fc6abb3"}
Apr 21 04:42:50.852188 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.852167 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:42:50.912514 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.912426 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerID="ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62" exitCode=0
Apr 21 04:42:50.912654 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.912509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" event={"ID":"5b878c0e-21b9-45ef-8470-6867a0fc6bdc","Type":"ContainerDied","Data":"ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62"}
Apr 21 04:42:50.912654 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.912528 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"
Apr 21 04:42:50.912654 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.912551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn" event={"ID":"5b878c0e-21b9-45ef-8470-6867a0fc6bdc","Type":"ContainerDied","Data":"0072eea2c0a429e9488c2a6b0e62794788bfbdabb3be02213b8fbaafe31ff8cf"}
Apr 21 04:42:50.912654 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.912567 2575 scope.go:117] "RemoveContainer" containerID="a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718"
Apr 21 04:42:50.919968 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.919946 2575 scope.go:117] "RemoveContainer" containerID="ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62"
Apr 21 04:42:50.926438 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.926423 2575 scope.go:117] "RemoveContainer" containerID="a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a"
Apr 21 04:42:50.935046 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.935021 2575 scope.go:117] "RemoveContainer" containerID="a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718"
Apr 21 04:42:50.935271 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:42:50.935249 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718\": container with ID starting with a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718 not found: ID does not exist" containerID="a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718"
Apr 21 04:42:50.935340 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.935280 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718"} err="failed to get container status
\"a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718\": rpc error: code = NotFound desc = could not find container \"a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718\": container with ID starting with a936f89662e5e39778ebdea0cbad83122e6b5d8e4d9284757cfaf4bab5c30718 not found: ID does not exist" Apr 21 04:42:50.935340 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.935297 2575 scope.go:117] "RemoveContainer" containerID="ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62" Apr 21 04:42:50.935512 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:42:50.935494 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62\": container with ID starting with ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62 not found: ID does not exist" containerID="ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62" Apr 21 04:42:50.935569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.935515 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62"} err="failed to get container status \"ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62\": rpc error: code = NotFound desc = could not find container \"ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62\": container with ID starting with ea0b6e038d2d1622213b04bfd62d1bb8aeeba907a9b90cd2518e48fa4306be62 not found: ID does not exist" Apr 21 04:42:50.935569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.935528 2575 scope.go:117] "RemoveContainer" containerID="a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a" Apr 21 04:42:50.935735 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:42:50.935717 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a\": container with ID starting with a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a not found: ID does not exist" containerID="a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a" Apr 21 04:42:50.935793 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.935743 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a"} err="failed to get container status \"a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a\": rpc error: code = NotFound desc = could not find container \"a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a\": container with ID starting with a2b712e767df0246b6b0fa546696b0ab4574cf3e624cf8520491da979f57032a not found: ID does not exist" Apr 21 04:42:50.937938 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.937923 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " Apr 21 04:42:50.937993 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.937957 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls\") pod \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " Apr 21 04:42:50.937993 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.937996 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2pxg\" (UniqueName: \"kubernetes.io/projected/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kube-api-access-w2pxg\") pod 
\"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " Apr 21 04:42:50.938082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.938036 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kserve-provision-location\") pod \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\" (UID: \"5b878c0e-21b9-45ef-8470-6867a0fc6bdc\") " Apr 21 04:42:50.938381 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.938275 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "5b878c0e-21b9-45ef-8470-6867a0fc6bdc" (UID: "5b878c0e-21b9-45ef-8470-6867a0fc6bdc"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:42:50.938456 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.938404 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b878c0e-21b9-45ef-8470-6867a0fc6bdc" (UID: "5b878c0e-21b9-45ef-8470-6867a0fc6bdc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:42:50.939947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.939929 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5b878c0e-21b9-45ef-8470-6867a0fc6bdc" (UID: "5b878c0e-21b9-45ef-8470-6867a0fc6bdc"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:50.940046 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:50.940026 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kube-api-access-w2pxg" (OuterVolumeSpecName: "kube-api-access-w2pxg") pod "5b878c0e-21b9-45ef-8470-6867a0fc6bdc" (UID: "5b878c0e-21b9-45ef-8470-6867a0fc6bdc"). InnerVolumeSpecName "kube-api-access-w2pxg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:42:51.038728 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:51.038705 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2pxg\" (UniqueName: \"kubernetes.io/projected/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kube-api-access-w2pxg\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:42:51.038728 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:51.038730 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:42:51.038880 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:51.038740 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:42:51.038880 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:51.038750 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b878c0e-21b9-45ef-8470-6867a0fc6bdc-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:42:51.228282 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:51.228201 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"] Apr 21 04:42:51.235203 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:51.235176 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-8jdwn"] Apr 21 04:42:52.920113 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:52.920081 2575 generic.go:358] "Generic (PLEG): container finished" podID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerID="87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e" exitCode=0 Apr 21 04:42:52.920454 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:52.920142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" event={"ID":"663fcda7-68d5-4f11-aabe-091e83cd9f7b","Type":"ContainerDied","Data":"87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e"} Apr 21 04:42:53.027540 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:53.027517 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" path="/var/lib/kubelet/pods/5b878c0e-21b9-45ef-8470-6867a0fc6bdc/volumes" Apr 21 04:42:53.924241 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:53.924206 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" event={"ID":"663fcda7-68d5-4f11-aabe-091e83cd9f7b","Type":"ContainerStarted","Data":"4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28"} Apr 21 04:42:53.924586 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:53.924250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" event={"ID":"663fcda7-68d5-4f11-aabe-091e83cd9f7b","Type":"ContainerStarted","Data":"ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e"} Apr 21 04:42:53.924586 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:53.924454 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" Apr 21 04:42:53.946150 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:53.946103 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podStartSLOduration=7.9460839199999995 podStartE2EDuration="7.94608392s" podCreationTimestamp="2026-04-21 04:42:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:42:53.944949015 +0000 UTC m=+2759.609539099" watchObservedRunningTime="2026-04-21 04:42:53.94608392 +0000 UTC m=+2759.610674007" Apr 21 04:42:54.927553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:54.927528 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" Apr 21 04:42:54.928623 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:54.928588 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 21 04:42:55.930433 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:42:55.930384 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 21 04:43:00.934466 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:43:00.934438 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" Apr 21 
04:43:00.935030 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:43:00.935006 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 21 04:43:10.935446 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:43:10.935404 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 21 04:43:20.935677 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:43:20.935583 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 21 04:43:30.935533 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:43:30.935490 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 21 04:43:40.935622 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:43:40.935576 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 21 04:43:50.935475 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:43:50.935434 2575 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 21 04:44:00.935496 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:00.935467 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" Apr 21 04:44:06.991388 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:06.991361 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"] Apr 21 04:44:06.991795 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:06.991679 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" containerID="cri-o://ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e" gracePeriod=30 Apr 21 04:44:06.991795 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:06.991733 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kube-rbac-proxy" containerID="cri-o://4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28" gracePeriod=30 Apr 21 04:44:07.081218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.081191 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp"] Apr 21 04:44:07.081483 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.081470 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kube-rbac-proxy" Apr 21 04:44:07.081527 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:44:07.081485 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kube-rbac-proxy" Apr 21 04:44:07.081527 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.081495 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" Apr 21 04:44:07.081527 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.081501 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" Apr 21 04:44:07.081527 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.081515 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="storage-initializer" Apr 21 04:44:07.081527 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.081520 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="storage-initializer" Apr 21 04:44:07.081683 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.081568 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kserve-container" Apr 21 04:44:07.081683 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.081577 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b878c0e-21b9-45ef-8470-6867a0fc6bdc" containerName="kube-rbac-proxy" Apr 21 04:44:07.084362 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.084344 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.086992 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.086970 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 21 04:44:07.087082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.086971 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 21 04:44:07.093547 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.093524 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp"] Apr 21 04:44:07.114988 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.114967 2575 generic.go:358] "Generic (PLEG): container finished" podID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerID="4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28" exitCode=2 Apr 21 04:44:07.115089 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.115033 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" event={"ID":"663fcda7-68d5-4f11-aabe-091e83cd9f7b","Type":"ContainerDied","Data":"4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28"} Apr 21 04:44:07.181213 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.181189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40a5ee9e-f79e-4547-83b3-df0930d53f38-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.181316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.181218 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ddvn\" (UniqueName: \"kubernetes.io/projected/40a5ee9e-f79e-4547-83b3-df0930d53f38-kube-api-access-2ddvn\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.181316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.181243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40a5ee9e-f79e-4547-83b3-df0930d53f38-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.181432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.181395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40a5ee9e-f79e-4547-83b3-df0930d53f38-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.282640 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.282572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40a5ee9e-f79e-4547-83b3-df0930d53f38-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.282640 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.282634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40a5ee9e-f79e-4547-83b3-df0930d53f38-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.282813 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.282663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ddvn\" (UniqueName: \"kubernetes.io/projected/40a5ee9e-f79e-4547-83b3-df0930d53f38-kube-api-access-2ddvn\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.282813 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.282702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40a5ee9e-f79e-4547-83b3-df0930d53f38-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.283020 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.283001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40a5ee9e-f79e-4547-83b3-df0930d53f38-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.283484 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.283462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/40a5ee9e-f79e-4547-83b3-df0930d53f38-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.285025 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.285005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40a5ee9e-f79e-4547-83b3-df0930d53f38-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.292015 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.291990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ddvn\" (UniqueName: \"kubernetes.io/projected/40a5ee9e-f79e-4547-83b3-df0930d53f38-kube-api-access-2ddvn\") pod \"isvc-tensorflow-predictor-6756f669d7-hqrvp\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.395003 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.394975 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:07.509467 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:07.509444 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp"] Apr 21 04:44:08.119961 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:08.119914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" event={"ID":"40a5ee9e-f79e-4547-83b3-df0930d53f38","Type":"ContainerStarted","Data":"4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd"} Apr 21 04:44:08.119961 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:08.119956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" event={"ID":"40a5ee9e-f79e-4547-83b3-df0930d53f38","Type":"ContainerStarted","Data":"adb1cb9e8504ab10c6e419024718f3990b37f94b006f66559a44b7292844e48f"} Apr 21 04:44:10.828220 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.828198 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" Apr 21 04:44:10.905616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.905545 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d2qb\" (UniqueName: \"kubernetes.io/projected/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kube-api-access-4d2qb\") pod \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " Apr 21 04:44:10.905616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.905605 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls\") pod \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " Apr 21 04:44:10.905822 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.905637 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kserve-provision-location\") pod \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " Apr 21 04:44:10.905822 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.905667 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/663fcda7-68d5-4f11-aabe-091e83cd9f7b-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\" (UID: \"663fcda7-68d5-4f11-aabe-091e83cd9f7b\") " Apr 21 04:44:10.906037 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.906012 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"663fcda7-68d5-4f11-aabe-091e83cd9f7b" (UID: "663fcda7-68d5-4f11-aabe-091e83cd9f7b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:44:10.906167 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.906067 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/663fcda7-68d5-4f11-aabe-091e83cd9f7b-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "663fcda7-68d5-4f11-aabe-091e83cd9f7b" (UID: "663fcda7-68d5-4f11-aabe-091e83cd9f7b"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:44:10.907573 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.907546 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "663fcda7-68d5-4f11-aabe-091e83cd9f7b" (UID: "663fcda7-68d5-4f11-aabe-091e83cd9f7b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:44:10.907656 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:10.907617 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kube-api-access-4d2qb" (OuterVolumeSpecName: "kube-api-access-4d2qb") pod "663fcda7-68d5-4f11-aabe-091e83cd9f7b" (UID: "663fcda7-68d5-4f11-aabe-091e83cd9f7b"). InnerVolumeSpecName "kube-api-access-4d2qb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:44:11.006868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.006725 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4d2qb\" (UniqueName: \"kubernetes.io/projected/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kube-api-access-4d2qb\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:44:11.006868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.006752 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/663fcda7-68d5-4f11-aabe-091e83cd9f7b-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:44:11.006868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.006802 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/663fcda7-68d5-4f11-aabe-091e83cd9f7b-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:44:11.006868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.006816 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/663fcda7-68d5-4f11-aabe-091e83cd9f7b-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:44:11.130090 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.130062 2575 generic.go:358] "Generic (PLEG): container finished" podID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerID="ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e" exitCode=0 Apr 21 04:44:11.130217 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.130143 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" Apr 21 04:44:11.130217 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.130142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" event={"ID":"663fcda7-68d5-4f11-aabe-091e83cd9f7b","Type":"ContainerDied","Data":"ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e"} Apr 21 04:44:11.130312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.130245 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc" event={"ID":"663fcda7-68d5-4f11-aabe-091e83cd9f7b","Type":"ContainerDied","Data":"ac996a319546c963b254317896ed06bec8b5b52edc4b659d5134c93d8fc6abb3"} Apr 21 04:44:11.130312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.130267 2575 scope.go:117] "RemoveContainer" containerID="4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28" Apr 21 04:44:11.137708 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.137689 2575 scope.go:117] "RemoveContainer" containerID="ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e" Apr 21 04:44:11.144235 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.144207 2575 scope.go:117] "RemoveContainer" containerID="87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e" Apr 21 04:44:11.147597 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.147566 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"] Apr 21 04:44:11.150787 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.150748 2575 scope.go:117] "RemoveContainer" containerID="4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28" Apr 21 04:44:11.151135 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:44:11.151089 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28\": container with ID starting with 4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28 not found: ID does not exist" containerID="4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28" Apr 21 04:44:11.151253 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.151146 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28"} err="failed to get container status \"4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28\": rpc error: code = NotFound desc = could not find container \"4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28\": container with ID starting with 4ea1cddcec002648d3c67eca7dff127a570e24d65876c9b8e6cf749d4d360e28 not found: ID does not exist" Apr 21 04:44:11.151253 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.151176 2575 scope.go:117] "RemoveContainer" containerID="ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e" Apr 21 04:44:11.151522 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:44:11.151502 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e\": container with ID starting with ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e not found: ID does not exist" containerID="ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e" Apr 21 04:44:11.151637 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.151527 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e"} err="failed to get container status \"ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e\": rpc error: code = NotFound desc = could not find 
container \"ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e\": container with ID starting with ce22e8d397730b00494e186fc1eb1b4390dd513608b7cce0bd55affea298058e not found: ID does not exist" Apr 21 04:44:11.151637 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.151544 2575 scope.go:117] "RemoveContainer" containerID="87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e" Apr 21 04:44:11.151838 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:44:11.151822 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e\": container with ID starting with 87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e not found: ID does not exist" containerID="87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e" Apr 21 04:44:11.151901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.151845 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e"} err="failed to get container status \"87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e\": rpc error: code = NotFound desc = could not find container \"87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e\": container with ID starting with 87a39a6b59ce00dc68a04cdf950c6199953056638dba7bb9ddd9a502f8bd222e not found: ID does not exist" Apr 21 04:44:11.152811 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:11.152792 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-fw5xc"] Apr 21 04:44:13.027630 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:13.027598 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" path="/var/lib/kubelet/pods/663fcda7-68d5-4f11-aabe-091e83cd9f7b/volumes" Apr 21 04:44:13.142106 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:13.142073 2575 generic.go:358] "Generic (PLEG): container finished" podID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerID="4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd" exitCode=0 Apr 21 04:44:13.142257 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:13.142151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" event={"ID":"40a5ee9e-f79e-4547-83b3-df0930d53f38","Type":"ContainerDied","Data":"4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd"} Apr 21 04:44:18.162248 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:18.162213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" event={"ID":"40a5ee9e-f79e-4547-83b3-df0930d53f38","Type":"ContainerStarted","Data":"c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315"} Apr 21 04:44:18.162248 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:18.162252 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" event={"ID":"40a5ee9e-f79e-4547-83b3-df0930d53f38","Type":"ContainerStarted","Data":"53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b"} Apr 21 04:44:18.162852 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:18.162458 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:18.182022 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:18.181981 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podStartSLOduration=7.189885669 podStartE2EDuration="11.181969504s" podCreationTimestamp="2026-04-21 04:44:07 +0000 UTC" firstStartedPulling="2026-04-21 04:44:13.14339386 +0000 UTC m=+2838.807983927" lastFinishedPulling="2026-04-21 
04:44:17.135477698 +0000 UTC m=+2842.800067762" observedRunningTime="2026-04-21 04:44:18.180665266 +0000 UTC m=+2843.845255353" watchObservedRunningTime="2026-04-21 04:44:18.181969504 +0000 UTC m=+2843.846559589" Apr 21 04:44:19.164775 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:19.164733 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:19.165951 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:19.165924 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 21 04:44:20.169131 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:20.169078 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 21 04:44:25.174260 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:25.174229 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:44:25.174876 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:25.174850 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 21 04:44:35.175913 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:35.175881 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" 
Apr 21 04:44:46.592267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.592183 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp"] Apr 21 04:44:46.592667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.592636 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kserve-container" containerID="cri-o://53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b" gracePeriod=30 Apr 21 04:44:46.592782 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.592713 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" containerID="cri-o://c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315" gracePeriod=30 Apr 21 04:44:46.663365 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663337 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj"] Apr 21 04:44:46.663601 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663589 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kube-rbac-proxy" Apr 21 04:44:46.663652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663602 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kube-rbac-proxy" Apr 21 04:44:46.663652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663614 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" Apr 21 04:44:46.663652 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663620 2575 
state_mem.go:107] "Deleted CPUSet assignment" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" Apr 21 04:44:46.663778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663657 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="storage-initializer" Apr 21 04:44:46.663778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663663 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="storage-initializer" Apr 21 04:44:46.663778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663744 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kube-rbac-proxy" Apr 21 04:44:46.663778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.663752 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="663fcda7-68d5-4f11-aabe-091e83cd9f7b" containerName="kserve-container" Apr 21 04:44:46.666673 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.666658 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.669340 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.669319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 21 04:44:46.669656 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.669640 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 21 04:44:46.677484 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.677459 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj"] Apr 21 04:44:46.762245 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.762216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69c61af4-a692-48b1-9b7b-e0459454377b-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.762360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.762273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69c61af4-a692-48b1-9b7b-e0459454377b-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.762360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.762326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69c61af4-a692-48b1-9b7b-e0459454377b-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.762444 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.762366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7xl9\" (UniqueName: \"kubernetes.io/projected/69c61af4-a692-48b1-9b7b-e0459454377b-kube-api-access-j7xl9\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.863248 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.863192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69c61af4-a692-48b1-9b7b-e0459454377b-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.863248 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.863228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69c61af4-a692-48b1-9b7b-e0459454377b-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.863390 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.863261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/69c61af4-a692-48b1-9b7b-e0459454377b-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.863390 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.863295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7xl9\" (UniqueName: \"kubernetes.io/projected/69c61af4-a692-48b1-9b7b-e0459454377b-kube-api-access-j7xl9\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.863570 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.863551 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69c61af4-a692-48b1-9b7b-e0459454377b-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.863901 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.863880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69c61af4-a692-48b1-9b7b-e0459454377b-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.865486 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.865466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69c61af4-a692-48b1-9b7b-e0459454377b-proxy-tls\") pod 
\"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.872015 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.871997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7xl9\" (UniqueName: \"kubernetes.io/projected/69c61af4-a692-48b1-9b7b-e0459454377b-kube-api-access-j7xl9\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-cknmj\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:46.976087 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:46.976065 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:47.095823 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:47.095793 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj"] Apr 21 04:44:47.098124 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:44:47.098097 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c61af4_a692_48b1_9b7b_e0459454377b.slice/crio-554e5b31a6994a84486cf3e09281e583910981f88abad70bdb3d6bd538642525 WatchSource:0}: Error finding container 554e5b31a6994a84486cf3e09281e583910981f88abad70bdb3d6bd538642525: Status 404 returned error can't find the container with id 554e5b31a6994a84486cf3e09281e583910981f88abad70bdb3d6bd538642525 Apr 21 04:44:47.245872 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:47.245844 2575 generic.go:358] "Generic (PLEG): container finished" podID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerID="c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315" exitCode=2 Apr 21 04:44:47.246042 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:47.245924 
2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" event={"ID":"40a5ee9e-f79e-4547-83b3-df0930d53f38","Type":"ContainerDied","Data":"c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315"} Apr 21 04:44:47.247304 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:47.247278 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" event={"ID":"69c61af4-a692-48b1-9b7b-e0459454377b","Type":"ContainerStarted","Data":"acad19e0583f8444c60d1139751a3ad57d8f7d8aa41e09574ee99835e0b244e4"} Apr 21 04:44:47.247393 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:47.247313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" event={"ID":"69c61af4-a692-48b1-9b7b-e0459454377b","Type":"ContainerStarted","Data":"554e5b31a6994a84486cf3e09281e583910981f88abad70bdb3d6bd538642525"} Apr 21 04:44:50.169663 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:50.169619 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 21 04:44:52.262226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:52.262190 2575 generic.go:358] "Generic (PLEG): container finished" podID="69c61af4-a692-48b1-9b7b-e0459454377b" containerID="acad19e0583f8444c60d1139751a3ad57d8f7d8aa41e09574ee99835e0b244e4" exitCode=0 Apr 21 04:44:52.262588 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:52.262262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" 
event={"ID":"69c61af4-a692-48b1-9b7b-e0459454377b","Type":"ContainerDied","Data":"acad19e0583f8444c60d1139751a3ad57d8f7d8aa41e09574ee99835e0b244e4"} Apr 21 04:44:53.266896 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:53.266862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" event={"ID":"69c61af4-a692-48b1-9b7b-e0459454377b","Type":"ContainerStarted","Data":"b6edade88b0d8bd2f9056d9e4b958664446c1a0ba52f0f6406d388649431f959"} Apr 21 04:44:53.266896 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:53.266899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" event={"ID":"69c61af4-a692-48b1-9b7b-e0459454377b","Type":"ContainerStarted","Data":"a746485ac54fd393b807cde5514a5d33c0bb6af23e74a4fbb68f03ec1471fb45"} Apr 21 04:44:53.267379 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:53.267175 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:53.267379 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:53.267292 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:53.268413 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:53.268386 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 21 04:44:53.285519 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:53.285479 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podStartSLOduration=7.285469464 
podStartE2EDuration="7.285469464s" podCreationTimestamp="2026-04-21 04:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:44:53.28491565 +0000 UTC m=+2878.949505736" watchObservedRunningTime="2026-04-21 04:44:53.285469464 +0000 UTC m=+2878.950059549" Apr 21 04:44:54.270401 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:54.270352 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 21 04:44:55.170001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:55.169965 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 21 04:44:59.274222 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:59.274188 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:44:59.274668 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:44:59.274641 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 21 04:45:00.170321 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:00.170276 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" 
podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 21 04:45:00.170505 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:00.170408 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:45:05.169555 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:05.169515 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 21 04:45:09.275417 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:09.275381 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:45:10.169644 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:10.169608 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 21 04:45:15.169297 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:15.169259 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 21 04:45:17.231085 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.231064 2575 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:45:17.332205 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.332174 2575 generic.go:358] "Generic (PLEG): container finished" podID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerID="53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b" exitCode=137 Apr 21 04:45:17.332361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.332244 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" event={"ID":"40a5ee9e-f79e-4547-83b3-df0930d53f38","Type":"ContainerDied","Data":"53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b"} Apr 21 04:45:17.332361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.332254 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" Apr 21 04:45:17.332361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.332278 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp" event={"ID":"40a5ee9e-f79e-4547-83b3-df0930d53f38","Type":"ContainerDied","Data":"adb1cb9e8504ab10c6e419024718f3990b37f94b006f66559a44b7292844e48f"} Apr 21 04:45:17.332361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.332293 2575 scope.go:117] "RemoveContainer" containerID="c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315" Apr 21 04:45:17.339800 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.339767 2575 scope.go:117] "RemoveContainer" containerID="53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b" Apr 21 04:45:17.348288 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.348266 2575 scope.go:117] "RemoveContainer" containerID="4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd" Apr 21 04:45:17.354709 ip-10-0-134-15 kubenswrapper[2575]: 
I0421 04:45:17.354689 2575 scope.go:117] "RemoveContainer" containerID="c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315" Apr 21 04:45:17.354953 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:45:17.354935 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315\": container with ID starting with c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315 not found: ID does not exist" containerID="c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315" Apr 21 04:45:17.355000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.354962 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315"} err="failed to get container status \"c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315\": rpc error: code = NotFound desc = could not find container \"c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315\": container with ID starting with c1444d54b0896b4dd36096ee81fd68bf94d0e78f55f95b0b61aca1b81b9da315 not found: ID does not exist" Apr 21 04:45:17.355000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.354979 2575 scope.go:117] "RemoveContainer" containerID="53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b" Apr 21 04:45:17.355217 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:45:17.355200 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b\": container with ID starting with 53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b not found: ID does not exist" containerID="53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b" Apr 21 04:45:17.355257 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.355224 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b"} err="failed to get container status \"53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b\": rpc error: code = NotFound desc = could not find container \"53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b\": container with ID starting with 53047d7b186a0b481db52c3f0303df5bc62b1257e74ba4ee26dd44259ff6b12b not found: ID does not exist" Apr 21 04:45:17.355257 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.355238 2575 scope.go:117] "RemoveContainer" containerID="4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd" Apr 21 04:45:17.355465 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:45:17.355447 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd\": container with ID starting with 4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd not found: ID does not exist" containerID="4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd" Apr 21 04:45:17.355502 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.355472 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd"} err="failed to get container status \"4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd\": rpc error: code = NotFound desc = could not find container \"4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd\": container with ID starting with 4c7a8605c7d900e855a8fb152dd2239b85479e57cbe4d42b8c98f76523994ffd not found: ID does not exist" Apr 21 04:45:17.389471 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.389452 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40a5ee9e-f79e-4547-83b3-df0930d53f38-kserve-provision-location\") pod \"40a5ee9e-f79e-4547-83b3-df0930d53f38\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " Apr 21 04:45:17.389570 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.389485 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40a5ee9e-f79e-4547-83b3-df0930d53f38-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"40a5ee9e-f79e-4547-83b3-df0930d53f38\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " Apr 21 04:45:17.389570 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.389504 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ddvn\" (UniqueName: \"kubernetes.io/projected/40a5ee9e-f79e-4547-83b3-df0930d53f38-kube-api-access-2ddvn\") pod \"40a5ee9e-f79e-4547-83b3-df0930d53f38\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " Apr 21 04:45:17.389570 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.389538 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40a5ee9e-f79e-4547-83b3-df0930d53f38-proxy-tls\") pod \"40a5ee9e-f79e-4547-83b3-df0930d53f38\" (UID: \"40a5ee9e-f79e-4547-83b3-df0930d53f38\") " Apr 21 04:45:17.389846 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.389825 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a5ee9e-f79e-4547-83b3-df0930d53f38-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "40a5ee9e-f79e-4547-83b3-df0930d53f38" (UID: "40a5ee9e-f79e-4547-83b3-df0930d53f38"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:45:17.391411 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.391387 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40a5ee9e-f79e-4547-83b3-df0930d53f38-kube-api-access-2ddvn" (OuterVolumeSpecName: "kube-api-access-2ddvn") pod "40a5ee9e-f79e-4547-83b3-df0930d53f38" (UID: "40a5ee9e-f79e-4547-83b3-df0930d53f38"). InnerVolumeSpecName "kube-api-access-2ddvn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:45:17.391536 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.391520 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a5ee9e-f79e-4547-83b3-df0930d53f38-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "40a5ee9e-f79e-4547-83b3-df0930d53f38" (UID: "40a5ee9e-f79e-4547-83b3-df0930d53f38"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:45:17.400557 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.400535 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40a5ee9e-f79e-4547-83b3-df0930d53f38-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "40a5ee9e-f79e-4547-83b3-df0930d53f38" (UID: "40a5ee9e-f79e-4547-83b3-df0930d53f38"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:45:17.490001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.489979 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40a5ee9e-f79e-4547-83b3-df0930d53f38-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:45:17.490001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.490001 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ddvn\" (UniqueName: \"kubernetes.io/projected/40a5ee9e-f79e-4547-83b3-df0930d53f38-kube-api-access-2ddvn\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:45:17.490114 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.490014 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40a5ee9e-f79e-4547-83b3-df0930d53f38-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:45:17.490114 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.490024 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40a5ee9e-f79e-4547-83b3-df0930d53f38-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:45:17.654376 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.654347 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp"] Apr 21 04:45:17.657617 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:17.657594 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-hqrvp"] Apr 21 04:45:19.028803 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:19.028753 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" 
path="/var/lib/kubelet/pods/40a5ee9e-f79e-4547-83b3-df0930d53f38/volumes" Apr 21 04:45:27.079449 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.079417 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj"] Apr 21 04:45:27.079918 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.079853 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kserve-container" containerID="cri-o://a746485ac54fd393b807cde5514a5d33c0bb6af23e74a4fbb68f03ec1471fb45" gracePeriod=30 Apr 21 04:45:27.079918 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.079875 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" containerID="cri-o://b6edade88b0d8bd2f9056d9e4b958664446c1a0ba52f0f6406d388649431f959" gracePeriod=30 Apr 21 04:45:27.156001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.155970 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7"] Apr 21 04:45:27.156247 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.156235 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="storage-initializer" Apr 21 04:45:27.156247 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.156247 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="storage-initializer" Apr 21 04:45:27.156337 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.156256 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kserve-container" 
Apr 21 04:45:27.156337 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.156261 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kserve-container" Apr 21 04:45:27.156337 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.156273 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" Apr 21 04:45:27.156337 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.156278 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" Apr 21 04:45:27.156337 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.156320 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kube-rbac-proxy" Apr 21 04:45:27.156337 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.156330 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="40a5ee9e-f79e-4547-83b3-df0930d53f38" containerName="kserve-container" Apr 21 04:45:27.160831 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.160814 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.163079 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.163053 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 21 04:45:27.163252 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.163235 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 21 04:45:27.167578 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.167554 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7"] Apr 21 04:45:27.258057 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.258032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/279d0745-08be-40f4-b564-9ed1b2234600-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.258165 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.258061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tntbw\" (UniqueName: \"kubernetes.io/projected/279d0745-08be-40f4-b564-9ed1b2234600-kube-api-access-tntbw\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.258165 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.258110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/279d0745-08be-40f4-b564-9ed1b2234600-isvc-triton-kube-rbac-proxy-sar-config\") pod 
\"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.258165 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.258138 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/279d0745-08be-40f4-b564-9ed1b2234600-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.358439 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.358369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/279d0745-08be-40f4-b564-9ed1b2234600-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.358439 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.358410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tntbw\" (UniqueName: \"kubernetes.io/projected/279d0745-08be-40f4-b564-9ed1b2234600-kube-api-access-tntbw\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.358615 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.358473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/279d0745-08be-40f4-b564-9ed1b2234600-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 
04:45:27.358615 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.358514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/279d0745-08be-40f4-b564-9ed1b2234600-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.358984 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.358961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/279d0745-08be-40f4-b564-9ed1b2234600-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.359157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.359139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/279d0745-08be-40f4-b564-9ed1b2234600-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.361073 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.361053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/279d0745-08be-40f4-b564-9ed1b2234600-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.362802 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.362772 2575 generic.go:358] "Generic (PLEG): container finished" podID="69c61af4-a692-48b1-9b7b-e0459454377b" 
containerID="b6edade88b0d8bd2f9056d9e4b958664446c1a0ba52f0f6406d388649431f959" exitCode=2 Apr 21 04:45:27.362896 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.362833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" event={"ID":"69c61af4-a692-48b1-9b7b-e0459454377b","Type":"ContainerDied","Data":"b6edade88b0d8bd2f9056d9e4b958664446c1a0ba52f0f6406d388649431f959"} Apr 21 04:45:27.368118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.368096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tntbw\" (UniqueName: \"kubernetes.io/projected/279d0745-08be-40f4-b564-9ed1b2234600-kube-api-access-tntbw\") pod \"isvc-triton-predictor-84bb65d94b-tnng7\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.472070 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.472044 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:45:27.584446 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:27.584419 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7"] Apr 21 04:45:27.586735 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:45:27.586705 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod279d0745_08be_40f4_b564_9ed1b2234600.slice/crio-e4dd83941ec49d6c364d8fe65d08cee3a470172178f6baacf738a14d7d33e0f7 WatchSource:0}: Error finding container e4dd83941ec49d6c364d8fe65d08cee3a470172178f6baacf738a14d7d33e0f7: Status 404 returned error can't find the container with id e4dd83941ec49d6c364d8fe65d08cee3a470172178f6baacf738a14d7d33e0f7 Apr 21 04:45:28.367914 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:28.367880 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" event={"ID":"279d0745-08be-40f4-b564-9ed1b2234600","Type":"ContainerStarted","Data":"e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa"} Apr 21 04:45:28.367914 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:28.367915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" event={"ID":"279d0745-08be-40f4-b564-9ed1b2234600","Type":"ContainerStarted","Data":"e4dd83941ec49d6c364d8fe65d08cee3a470172178f6baacf738a14d7d33e0f7"} Apr 21 04:45:29.271062 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:29.271018 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 21 04:45:32.380416 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:32.380387 2575 generic.go:358] "Generic (PLEG): container finished" podID="279d0745-08be-40f4-b564-9ed1b2234600" containerID="e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa" exitCode=0 Apr 21 04:45:32.380774 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:32.380459 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" event={"ID":"279d0745-08be-40f4-b564-9ed1b2234600","Type":"ContainerDied","Data":"e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa"} Apr 21 04:45:34.271531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:34.271065 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 21 04:45:39.271873 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:39.271524 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 21 04:45:39.271873 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:39.271642 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:45:44.271361 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:44.270891 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 21 04:45:49.271006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:49.270950 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 21 04:45:54.271482 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:54.271427 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 21 04:45:57.477055 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.477013 2575 generic.go:358] "Generic (PLEG): container finished" podID="69c61af4-a692-48b1-9b7b-e0459454377b" containerID="a746485ac54fd393b807cde5514a5d33c0bb6af23e74a4fbb68f03ec1471fb45" exitCode=137 Apr 21 04:45:57.477567 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.477094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" event={"ID":"69c61af4-a692-48b1-9b7b-e0459454377b","Type":"ContainerDied","Data":"a746485ac54fd393b807cde5514a5d33c0bb6af23e74a4fbb68f03ec1471fb45"} Apr 21 04:45:57.772829 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.772546 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:45:57.804926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.804884 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69c61af4-a692-48b1-9b7b-e0459454377b-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"69c61af4-a692-48b1-9b7b-e0459454377b\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " Apr 21 04:45:57.804926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.804935 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7xl9\" (UniqueName: \"kubernetes.io/projected/69c61af4-a692-48b1-9b7b-e0459454377b-kube-api-access-j7xl9\") pod \"69c61af4-a692-48b1-9b7b-e0459454377b\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " Apr 21 04:45:57.804926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.804972 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69c61af4-a692-48b1-9b7b-e0459454377b-kserve-provision-location\") pod \"69c61af4-a692-48b1-9b7b-e0459454377b\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " Apr 21 04:45:57.805293 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.805004 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69c61af4-a692-48b1-9b7b-e0459454377b-proxy-tls\") pod \"69c61af4-a692-48b1-9b7b-e0459454377b\" (UID: \"69c61af4-a692-48b1-9b7b-e0459454377b\") " Apr 21 04:45:57.805791 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.805699 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69c61af4-a692-48b1-9b7b-e0459454377b-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "69c61af4-a692-48b1-9b7b-e0459454377b" (UID: "69c61af4-a692-48b1-9b7b-e0459454377b"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:45:57.809334 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.809308 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c61af4-a692-48b1-9b7b-e0459454377b-kube-api-access-j7xl9" (OuterVolumeSpecName: "kube-api-access-j7xl9") pod "69c61af4-a692-48b1-9b7b-e0459454377b" (UID: "69c61af4-a692-48b1-9b7b-e0459454377b"). InnerVolumeSpecName "kube-api-access-j7xl9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:45:57.811339 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.811305 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c61af4-a692-48b1-9b7b-e0459454377b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "69c61af4-a692-48b1-9b7b-e0459454377b" (UID: "69c61af4-a692-48b1-9b7b-e0459454377b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:45:57.815228 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.815145 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c61af4-a692-48b1-9b7b-e0459454377b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "69c61af4-a692-48b1-9b7b-e0459454377b" (UID: "69c61af4-a692-48b1-9b7b-e0459454377b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:45:57.905853 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.905823 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69c61af4-a692-48b1-9b7b-e0459454377b-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:45:57.905853 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.905848 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j7xl9\" (UniqueName: \"kubernetes.io/projected/69c61af4-a692-48b1-9b7b-e0459454377b-kube-api-access-j7xl9\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:45:57.905853 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.905859 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69c61af4-a692-48b1-9b7b-e0459454377b-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:45:57.906059 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:57.905889 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69c61af4-a692-48b1-9b7b-e0459454377b-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:45:58.483259 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:58.483160 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" event={"ID":"69c61af4-a692-48b1-9b7b-e0459454377b","Type":"ContainerDied","Data":"554e5b31a6994a84486cf3e09281e583910981f88abad70bdb3d6bd538642525"} Apr 21 04:45:58.483259 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:58.483205 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj" Apr 21 04:45:58.483259 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:58.483214 2575 scope.go:117] "RemoveContainer" containerID="b6edade88b0d8bd2f9056d9e4b958664446c1a0ba52f0f6406d388649431f959" Apr 21 04:45:58.494647 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:58.494619 2575 scope.go:117] "RemoveContainer" containerID="a746485ac54fd393b807cde5514a5d33c0bb6af23e74a4fbb68f03ec1471fb45" Apr 21 04:45:58.504341 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:58.504316 2575 scope.go:117] "RemoveContainer" containerID="acad19e0583f8444c60d1139751a3ad57d8f7d8aa41e09574ee99835e0b244e4" Apr 21 04:45:58.510310 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:58.510285 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj"] Apr 21 04:45:58.513166 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:58.513133 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-cknmj"] Apr 21 04:45:59.029951 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:45:59.029918 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" path="/var/lib/kubelet/pods/69c61af4-a692-48b1-9b7b-e0459454377b/volumes" Apr 21 04:47:27.748824 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:27.748785 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" event={"ID":"279d0745-08be-40f4-b564-9ed1b2234600","Type":"ContainerStarted","Data":"efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c"} Apr 21 04:47:27.748824 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:27.748828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" 
event={"ID":"279d0745-08be-40f4-b564-9ed1b2234600","Type":"ContainerStarted","Data":"ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e"} Apr 21 04:47:27.749263 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:27.749046 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:47:27.768587 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:27.768541 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" podStartSLOduration=6.45333261 podStartE2EDuration="2m0.768525947s" podCreationTimestamp="2026-04-21 04:45:27 +0000 UTC" firstStartedPulling="2026-04-21 04:45:32.381495374 +0000 UTC m=+2918.046085439" lastFinishedPulling="2026-04-21 04:47:26.696688712 +0000 UTC m=+3032.361278776" observedRunningTime="2026-04-21 04:47:27.767117113 +0000 UTC m=+3033.431707210" watchObservedRunningTime="2026-04-21 04:47:27.768525947 +0000 UTC m=+3033.433116032" Apr 21 04:47:28.751926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:28.751888 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:47:28.753102 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:28.753075 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 21 04:47:29.754301 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:29.754264 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 
21 04:47:34.758308 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:34.758277 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:47:34.759079 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:34.759061 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:47:38.721455 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.721359 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7"] Apr 21 04:47:38.803229 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.721655 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kserve-container" containerID="cri-o://ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e" gracePeriod=30 Apr 21 04:47:38.803229 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.721686 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kube-rbac-proxy" containerID="cri-o://efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c" gracePeriod=30 Apr 21 04:47:38.864711 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.864685 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg"] Apr 21 04:47:38.864981 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.864967 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" Apr 21 04:47:38.865033 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.864983 2575 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" Apr 21 04:47:38.865033 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.864994 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="storage-initializer" Apr 21 04:47:38.865033 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.865000 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="storage-initializer" Apr 21 04:47:38.865033 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.865016 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kserve-container" Apr 21 04:47:38.865033 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.865022 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kserve-container" Apr 21 04:47:38.865191 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.865068 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kube-rbac-proxy" Apr 21 04:47:38.865191 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.865077 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="69c61af4-a692-48b1-9b7b-e0459454377b" containerName="kserve-container" Apr 21 04:47:38.880748 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.880726 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:38.883629 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.883606 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 21 04:47:38.883987 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.883659 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 21 04:47:38.888670 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:38.888650 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg"] Apr 21 04:47:39.015149 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.015080 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/228589af-8d0b-4f3a-8afc-c5fa7489a94d-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.015149 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.015130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.015315 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.015154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.015315 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.015182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqln\" (UniqueName: \"kubernetes.io/projected/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kube-api-access-mrqln\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.115505 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.115476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/228589af-8d0b-4f3a-8afc-c5fa7489a94d-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.115720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.115526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.115720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.115550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: 
\"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.115720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.115589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrqln\" (UniqueName: \"kubernetes.io/projected/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kube-api-access-mrqln\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.115720 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:47:39.115632 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-predictor-serving-cert: secret "isvc-xgboost-predictor-serving-cert" not found Apr 21 04:47:39.115720 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:47:39.115705 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls podName:228589af-8d0b-4f3a-8afc-c5fa7489a94d nodeName:}" failed. No retries permitted until 2026-04-21 04:47:39.615683876 +0000 UTC m=+3045.280273955 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls") pod "isvc-xgboost-predictor-8689c4cfcc-hvcbg" (UID: "228589af-8d0b-4f3a-8afc-c5fa7489a94d") : secret "isvc-xgboost-predictor-serving-cert" not found Apr 21 04:47:39.116078 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.116054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.116218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.116203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/228589af-8d0b-4f3a-8afc-c5fa7489a94d-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.124776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.124735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqln\" (UniqueName: \"kubernetes.io/projected/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kube-api-access-mrqln\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.618993 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.618959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: 
\"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.621317 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.621287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-hvcbg\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.755411 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.755375 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.54:8643/healthz\": dial tcp 10.132.0.54:8643: connect: connection refused" Apr 21 04:47:39.782812 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.782782 2575 generic.go:358] "Generic (PLEG): container finished" podID="279d0745-08be-40f4-b564-9ed1b2234600" containerID="efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c" exitCode=2 Apr 21 04:47:39.782946 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.782854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" event={"ID":"279d0745-08be-40f4-b564-9ed1b2234600","Type":"ContainerDied","Data":"efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c"} Apr 21 04:47:39.792085 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.792068 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:47:39.909348 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:39.909165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg"] Apr 21 04:47:39.911744 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:47:39.911717 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228589af_8d0b_4f3a_8afc_c5fa7489a94d.slice/crio-8ec49b818b059988bfd99804487b77c75f07f78a248d64b0f3792a51a7f593ff WatchSource:0}: Error finding container 8ec49b818b059988bfd99804487b77c75f07f78a248d64b0f3792a51a7f593ff: Status 404 returned error can't find the container with id 8ec49b818b059988bfd99804487b77c75f07f78a248d64b0f3792a51a7f593ff Apr 21 04:47:40.787192 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:40.787153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" event={"ID":"228589af-8d0b-4f3a-8afc-c5fa7489a94d","Type":"ContainerStarted","Data":"f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69"} Apr 21 04:47:40.787192 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:40.787191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" event={"ID":"228589af-8d0b-4f3a-8afc-c5fa7489a94d","Type":"ContainerStarted","Data":"8ec49b818b059988bfd99804487b77c75f07f78a248d64b0f3792a51a7f593ff"} Apr 21 04:47:41.477234 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.477210 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:47:41.635366 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.635343 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/279d0745-08be-40f4-b564-9ed1b2234600-isvc-triton-kube-rbac-proxy-sar-config\") pod \"279d0745-08be-40f4-b564-9ed1b2234600\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " Apr 21 04:47:41.635517 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.635381 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/279d0745-08be-40f4-b564-9ed1b2234600-proxy-tls\") pod \"279d0745-08be-40f4-b564-9ed1b2234600\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " Apr 21 04:47:41.635517 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.635431 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tntbw\" (UniqueName: \"kubernetes.io/projected/279d0745-08be-40f4-b564-9ed1b2234600-kube-api-access-tntbw\") pod \"279d0745-08be-40f4-b564-9ed1b2234600\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " Apr 21 04:47:41.635517 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.635469 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/279d0745-08be-40f4-b564-9ed1b2234600-kserve-provision-location\") pod \"279d0745-08be-40f4-b564-9ed1b2234600\" (UID: \"279d0745-08be-40f4-b564-9ed1b2234600\") " Apr 21 04:47:41.635793 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.635744 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/279d0745-08be-40f4-b564-9ed1b2234600-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod 
"279d0745-08be-40f4-b564-9ed1b2234600" (UID: "279d0745-08be-40f4-b564-9ed1b2234600"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:47:41.635943 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.635918 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/279d0745-08be-40f4-b564-9ed1b2234600-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "279d0745-08be-40f4-b564-9ed1b2234600" (UID: "279d0745-08be-40f4-b564-9ed1b2234600"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:47:41.637451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.637427 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279d0745-08be-40f4-b564-9ed1b2234600-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "279d0745-08be-40f4-b564-9ed1b2234600" (UID: "279d0745-08be-40f4-b564-9ed1b2234600"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:47:41.637451 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.637439 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279d0745-08be-40f4-b564-9ed1b2234600-kube-api-access-tntbw" (OuterVolumeSpecName: "kube-api-access-tntbw") pod "279d0745-08be-40f4-b564-9ed1b2234600" (UID: "279d0745-08be-40f4-b564-9ed1b2234600"). InnerVolumeSpecName "kube-api-access-tntbw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:47:41.736104 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.736084 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tntbw\" (UniqueName: \"kubernetes.io/projected/279d0745-08be-40f4-b564-9ed1b2234600-kube-api-access-tntbw\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:47:41.736104 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.736104 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/279d0745-08be-40f4-b564-9ed1b2234600-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:47:41.736237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.736114 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/279d0745-08be-40f4-b564-9ed1b2234600-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:47:41.736237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.736124 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/279d0745-08be-40f4-b564-9ed1b2234600-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:47:41.791338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.791312 2575 generic.go:358] "Generic (PLEG): container finished" podID="279d0745-08be-40f4-b564-9ed1b2234600" containerID="ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e" exitCode=0 Apr 21 04:47:41.791631 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.791385 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" Apr 21 04:47:41.791631 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.791396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" event={"ID":"279d0745-08be-40f4-b564-9ed1b2234600","Type":"ContainerDied","Data":"ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e"} Apr 21 04:47:41.791631 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.791435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7" event={"ID":"279d0745-08be-40f4-b564-9ed1b2234600","Type":"ContainerDied","Data":"e4dd83941ec49d6c364d8fe65d08cee3a470172178f6baacf738a14d7d33e0f7"} Apr 21 04:47:41.791631 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.791450 2575 scope.go:117] "RemoveContainer" containerID="efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c" Apr 21 04:47:41.799928 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.799816 2575 scope.go:117] "RemoveContainer" containerID="ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e" Apr 21 04:47:41.806626 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.806597 2575 scope.go:117] "RemoveContainer" containerID="e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa" Apr 21 04:47:41.812819 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.812802 2575 scope.go:117] "RemoveContainer" containerID="efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c" Apr 21 04:47:41.813054 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:47:41.813034 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c\": container with ID starting with efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c not found: ID does not exist" 
containerID="efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c" Apr 21 04:47:41.813113 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.813063 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c"} err="failed to get container status \"efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c\": rpc error: code = NotFound desc = could not find container \"efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c\": container with ID starting with efcbdc735d9b595a466473b9a34fe376e4b68a1ba0e544ed968062b27ab7f72c not found: ID does not exist" Apr 21 04:47:41.813113 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.813079 2575 scope.go:117] "RemoveContainer" containerID="ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e" Apr 21 04:47:41.813314 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:47:41.813294 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e\": container with ID starting with ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e not found: ID does not exist" containerID="ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e" Apr 21 04:47:41.813360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.813319 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e"} err="failed to get container status \"ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e\": rpc error: code = NotFound desc = could not find container \"ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e\": container with ID starting with ef19e354bc8303723890503d7512c9cbe4eaa41266383627987d8b555661c02e not found: ID does not exist" Apr 21 
04:47:41.813360 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.813335 2575 scope.go:117] "RemoveContainer" containerID="e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa" Apr 21 04:47:41.813549 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:47:41.813532 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa\": container with ID starting with e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa not found: ID does not exist" containerID="e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa" Apr 21 04:47:41.813603 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.813556 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa"} err="failed to get container status \"e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa\": rpc error: code = NotFound desc = could not find container \"e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa\": container with ID starting with e97c694071abacbd69181c3cf321c3da6807234cadb8801d4a17ceaf2dcb04fa not found: ID does not exist" Apr 21 04:47:41.816833 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.816752 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7"] Apr 21 04:47:41.823777 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:41.823743 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-tnng7"] Apr 21 04:47:43.028032 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:43.027997 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279d0745-08be-40f4-b564-9ed1b2234600" path="/var/lib/kubelet/pods/279d0745-08be-40f4-b564-9ed1b2234600/volumes" Apr 21 04:47:44.801256 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:47:44.801223 2575 generic.go:358] "Generic (PLEG): container finished" podID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerID="f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69" exitCode=0 Apr 21 04:47:44.801619 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:47:44.801294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" event={"ID":"228589af-8d0b-4f3a-8afc-c5fa7489a94d","Type":"ContainerDied","Data":"f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69"} Apr 21 04:48:05.220886 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:05.220867 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:48:05.867356 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:05.867324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" event={"ID":"228589af-8d0b-4f3a-8afc-c5fa7489a94d","Type":"ContainerStarted","Data":"40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d"} Apr 21 04:48:05.867531 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:05.867363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" event={"ID":"228589af-8d0b-4f3a-8afc-c5fa7489a94d","Type":"ContainerStarted","Data":"3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90"} Apr 21 04:48:05.867596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:05.867562 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:48:05.888730 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:05.888675 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podStartSLOduration=7.570388355 podStartE2EDuration="27.888661446s" 
podCreationTimestamp="2026-04-21 04:47:38 +0000 UTC" firstStartedPulling="2026-04-21 04:47:44.80244837 +0000 UTC m=+3050.467038434" lastFinishedPulling="2026-04-21 04:48:05.120721461 +0000 UTC m=+3070.785311525" observedRunningTime="2026-04-21 04:48:05.886407842 +0000 UTC m=+3071.550997927" watchObservedRunningTime="2026-04-21 04:48:05.888661446 +0000 UTC m=+3071.553251532" Apr 21 04:48:06.869769 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:06.869737 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:48:06.870930 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:06.870904 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 21 04:48:07.872654 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:07.872612 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 21 04:48:12.877139 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:12.877113 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:48:12.877638 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:12.877615 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 21 04:48:22.877985 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:48:22.877932 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 21 04:48:32.877657 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:32.877620 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 21 04:48:42.877720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:42.877680 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 21 04:48:52.877750 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:48:52.877705 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 21 04:49:02.878941 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:02.878903 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:49:08.956438 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:08.956403 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg"] Apr 21 04:49:08.956926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:08.956674 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" containerID="cri-o://3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90" gracePeriod=30 Apr 21 04:49:08.956926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:08.956750 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kube-rbac-proxy" containerID="cri-o://40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d" gracePeriod=30 Apr 21 04:49:09.097814 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.097785 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9"] Apr 21 04:49:09.098088 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.098076 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kserve-container" Apr 21 04:49:09.098133 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.098091 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kserve-container" Apr 21 04:49:09.098133 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.098101 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="storage-initializer" Apr 21 04:49:09.098133 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.098106 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="storage-initializer" Apr 21 04:49:09.098133 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.098117 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kube-rbac-proxy" 
Apr 21 04:49:09.098133 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.098122 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kube-rbac-proxy" Apr 21 04:49:09.098287 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.098168 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kserve-container" Apr 21 04:49:09.098287 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.098178 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="279d0745-08be-40f4-b564-9ed1b2234600" containerName="kube-rbac-proxy" Apr 21 04:49:09.100985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.100971 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.103579 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.103562 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 21 04:49:09.103657 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.103575 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 21 04:49:09.114828 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.114804 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9"] Apr 21 04:49:09.261808 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.261704 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6957\" (UniqueName: \"kubernetes.io/projected/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kube-api-access-h6957\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.261808 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.261750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.262005 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.261864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.262005 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.261904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.362863 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.362831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6957\" (UniqueName: \"kubernetes.io/projected/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kube-api-access-h6957\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") 
" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.363013 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.362874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.363013 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.362927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.363013 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.362966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.363364 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.363338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.363614 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.363594 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.365372 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.365352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.371201 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.371177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6957\" (UniqueName: \"kubernetes.io/projected/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kube-api-access-h6957\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.415500 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.415465 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:09.532547 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:09.532524 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9"] Apr 21 04:49:09.534436 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:49:09.534407 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b86e38_40b4_4c37_80f5_e769ffa8c7ff.slice/crio-3ab345796c0c7b9e0de2de1d726167b60dc6b9d3a29a159b3c24233a4c37e975 WatchSource:0}: Error finding container 3ab345796c0c7b9e0de2de1d726167b60dc6b9d3a29a159b3c24233a4c37e975: Status 404 returned error can't find the container with id 3ab345796c0c7b9e0de2de1d726167b60dc6b9d3a29a159b3c24233a4c37e975 Apr 21 04:49:10.041865 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:10.041830 2575 generic.go:358] "Generic (PLEG): container finished" podID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerID="40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d" exitCode=2 Apr 21 04:49:10.042299 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:10.041907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" event={"ID":"228589af-8d0b-4f3a-8afc-c5fa7489a94d","Type":"ContainerDied","Data":"40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d"} Apr 21 04:49:10.043329 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:10.043286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" event={"ID":"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff","Type":"ContainerStarted","Data":"98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100"} Apr 21 04:49:10.043329 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:10.043320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" event={"ID":"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff","Type":"ContainerStarted","Data":"3ab345796c0c7b9e0de2de1d726167b60dc6b9d3a29a159b3c24233a4c37e975"} Apr 21 04:49:12.193778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.193744 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:49:12.283524 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.283504 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrqln\" (UniqueName: \"kubernetes.io/projected/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kube-api-access-mrqln\") pod \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " Apr 21 04:49:12.283659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.283536 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls\") pod \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " Apr 21 04:49:12.283659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.283580 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/228589af-8d0b-4f3a-8afc-c5fa7489a94d-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\" (UID: \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " Apr 21 04:49:12.283659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.283624 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kserve-provision-location\") pod \"228589af-8d0b-4f3a-8afc-c5fa7489a94d\" (UID: 
\"228589af-8d0b-4f3a-8afc-c5fa7489a94d\") " Apr 21 04:49:12.283965 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.283932 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228589af-8d0b-4f3a-8afc-c5fa7489a94d-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "228589af-8d0b-4f3a-8afc-c5fa7489a94d" (UID: "228589af-8d0b-4f3a-8afc-c5fa7489a94d"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:49:12.284063 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.283975 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "228589af-8d0b-4f3a-8afc-c5fa7489a94d" (UID: "228589af-8d0b-4f3a-8afc-c5fa7489a94d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:49:12.285599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.285572 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kube-api-access-mrqln" (OuterVolumeSpecName: "kube-api-access-mrqln") pod "228589af-8d0b-4f3a-8afc-c5fa7489a94d" (UID: "228589af-8d0b-4f3a-8afc-c5fa7489a94d"). InnerVolumeSpecName "kube-api-access-mrqln". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:49:12.285599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.285575 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "228589af-8d0b-4f3a-8afc-c5fa7489a94d" (UID: "228589af-8d0b-4f3a-8afc-c5fa7489a94d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:49:12.384688 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.384665 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/228589af-8d0b-4f3a-8afc-c5fa7489a94d-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:49:12.384688 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.384688 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:49:12.384836 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.384698 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrqln\" (UniqueName: \"kubernetes.io/projected/228589af-8d0b-4f3a-8afc-c5fa7489a94d-kube-api-access-mrqln\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:49:12.384836 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:12.384708 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/228589af-8d0b-4f3a-8afc-c5fa7489a94d-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:49:13.053326 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.053294 2575 generic.go:358] "Generic (PLEG): container finished" podID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerID="3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90" exitCode=0 Apr 21 04:49:13.053454 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.053373 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" Apr 21 04:49:13.053454 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.053373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" event={"ID":"228589af-8d0b-4f3a-8afc-c5fa7489a94d","Type":"ContainerDied","Data":"3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90"} Apr 21 04:49:13.053454 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.053411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg" event={"ID":"228589af-8d0b-4f3a-8afc-c5fa7489a94d","Type":"ContainerDied","Data":"8ec49b818b059988bfd99804487b77c75f07f78a248d64b0f3792a51a7f593ff"} Apr 21 04:49:13.053454 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.053426 2575 scope.go:117] "RemoveContainer" containerID="40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d" Apr 21 04:49:13.060864 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.060843 2575 scope.go:117] "RemoveContainer" containerID="3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90" Apr 21 04:49:13.069261 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.068972 2575 scope.go:117] "RemoveContainer" containerID="f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69" Apr 21 04:49:13.070846 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.070824 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg"] Apr 21 04:49:13.075069 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.075048 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-hvcbg"] Apr 21 04:49:13.076241 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.076072 2575 scope.go:117] "RemoveContainer" containerID="40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d" Apr 21 
04:49:13.076363 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:49:13.076343 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d\": container with ID starting with 40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d not found: ID does not exist" containerID="40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d" Apr 21 04:49:13.076440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.076374 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d"} err="failed to get container status \"40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d\": rpc error: code = NotFound desc = could not find container \"40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d\": container with ID starting with 40264d9212c64327928ae22519d9876456cf3c7a9ce041c951e22470381aba0d not found: ID does not exist" Apr 21 04:49:13.076440 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.076400 2575 scope.go:117] "RemoveContainer" containerID="3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90" Apr 21 04:49:13.076653 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:49:13.076637 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90\": container with ID starting with 3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90 not found: ID does not exist" containerID="3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90" Apr 21 04:49:13.076690 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.076660 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90"} err="failed to get container status \"3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90\": rpc error: code = NotFound desc = could not find container \"3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90\": container with ID starting with 3a58016048b9abb4238d4553722612406d679b77a9a02833a77b3d2d02ea0b90 not found: ID does not exist" Apr 21 04:49:13.076690 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.076681 2575 scope.go:117] "RemoveContainer" containerID="f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69" Apr 21 04:49:13.076916 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:49:13.076900 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69\": container with ID starting with f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69 not found: ID does not exist" containerID="f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69" Apr 21 04:49:13.076973 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:13.076921 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69"} err="failed to get container status \"f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69\": rpc error: code = NotFound desc = could not find container \"f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69\": container with ID starting with f813c87c3eda62714ba87df44707d655592c65aab215fe24e2df8d1a373edb69 not found: ID does not exist" Apr 21 04:49:14.057237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:14.057199 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" 
containerID="98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100" exitCode=0 Apr 21 04:49:14.057639 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:14.057271 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" event={"ID":"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff","Type":"ContainerDied","Data":"98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100"} Apr 21 04:49:15.028238 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:15.028202 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" path="/var/lib/kubelet/pods/228589af-8d0b-4f3a-8afc-c5fa7489a94d/volumes" Apr 21 04:49:15.062367 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:15.062339 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" event={"ID":"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff","Type":"ContainerStarted","Data":"22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23"} Apr 21 04:49:15.062805 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:15.062369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" event={"ID":"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff","Type":"ContainerStarted","Data":"ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6"} Apr 21 04:49:15.062805 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:15.062610 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:15.084479 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:15.084441 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" podStartSLOduration=6.084429693 podStartE2EDuration="6.084429693s" 
podCreationTimestamp="2026-04-21 04:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:49:15.083174513 +0000 UTC m=+3140.747764598" watchObservedRunningTime="2026-04-21 04:49:15.084429693 +0000 UTC m=+3140.749019779" Apr 21 04:49:16.066055 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:16.066024 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:22.073634 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:22.073603 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:52.078000 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:52.077965 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:49:59.131087 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.131053 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9"] Apr 21 04:49:59.131667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.131446 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kserve-container" containerID="cri-o://ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6" gracePeriod=30 Apr 21 04:49:59.131667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.131489 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kube-rbac-proxy" 
containerID="cri-o://22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23" gracePeriod=30 Apr 21 04:49:59.210664 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.210637 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj"] Apr 21 04:49:59.210932 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.210920 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kube-rbac-proxy" Apr 21 04:49:59.210932 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.210933 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kube-rbac-proxy" Apr 21 04:49:59.211023 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.210947 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" Apr 21 04:49:59.211023 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.210955 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" Apr 21 04:49:59.211023 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.210965 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="storage-initializer" Apr 21 04:49:59.211023 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.210971 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="storage-initializer" Apr 21 04:49:59.211023 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.211015 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kserve-container" Apr 21 04:49:59.211023 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.211022 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="228589af-8d0b-4f3a-8afc-c5fa7489a94d" containerName="kube-rbac-proxy" Apr 21 04:49:59.214199 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.214182 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.217034 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.217016 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 21 04:49:59.217213 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.217195 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 21 04:49:59.229577 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.229557 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj"] Apr 21 04:49:59.311890 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.311820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18c3c3c7-021e-4aea-a6af-0e9416731c02-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.311890 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.311858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtmn\" (UniqueName: \"kubernetes.io/projected/18c3c3c7-021e-4aea-a6af-0e9416731c02-kube-api-access-8xtmn\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.312084 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.311916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18c3c3c7-021e-4aea-a6af-0e9416731c02-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.312084 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.311941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18c3c3c7-021e-4aea-a6af-0e9416731c02-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.412937 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.412856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18c3c3c7-021e-4aea-a6af-0e9416731c02-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.412937 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.412894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xtmn\" (UniqueName: \"kubernetes.io/projected/18c3c3c7-021e-4aea-a6af-0e9416731c02-kube-api-access-8xtmn\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.412937 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.412928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18c3c3c7-021e-4aea-a6af-0e9416731c02-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.413167 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.412951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18c3c3c7-021e-4aea-a6af-0e9416731c02-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.413450 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.413429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18c3c3c7-021e-4aea-a6af-0e9416731c02-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.413520 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.413503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18c3c3c7-021e-4aea-a6af-0e9416731c02-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.415353 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:49:59.415336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18c3c3c7-021e-4aea-a6af-0e9416731c02-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.421294 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.421271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtmn\" (UniqueName: \"kubernetes.io/projected/18c3c3c7-021e-4aea-a6af-0e9416731c02-kube-api-access-8xtmn\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8gsjj\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.524394 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.524355 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:49:59.640265 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:49:59.640232 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj"] Apr 21 04:49:59.642977 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:49:59.642945 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c3c3c7_021e_4aea_a6af_0e9416731c02.slice/crio-d3e01daad9d528b52b5086a8f05210b8c5677e0046d12a1336375209e7e94f6f WatchSource:0}: Error finding container d3e01daad9d528b52b5086a8f05210b8c5677e0046d12a1336375209e7e94f6f: Status 404 returned error can't find the container with id d3e01daad9d528b52b5086a8f05210b8c5677e0046d12a1336375209e7e94f6f Apr 21 04:50:00.185505 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:00.185467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" event={"ID":"18c3c3c7-021e-4aea-a6af-0e9416731c02","Type":"ContainerStarted","Data":"b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200"} Apr 21 04:50:00.185983 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:00.185509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" event={"ID":"18c3c3c7-021e-4aea-a6af-0e9416731c02","Type":"ContainerStarted","Data":"d3e01daad9d528b52b5086a8f05210b8c5677e0046d12a1336375209e7e94f6f"} Apr 21 04:50:00.187382 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:00.187353 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerID="22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23" exitCode=2 Apr 21 04:50:00.187506 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:00.187418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" event={"ID":"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff","Type":"ContainerDied","Data":"22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23"} Apr 21 04:50:02.069444 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:02.069403 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.56:8643/healthz\": dial tcp 10.132.0.56:8643: connect: connection refused" Apr 21 04:50:02.074865 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:02.074833 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.132.0.56:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.132.0.56:8080: connect: connection refused" Apr 21 04:50:04.199976 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:04.199934 2575 generic.go:358] "Generic (PLEG): container finished" podID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerID="b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200" exitCode=0 Apr 21 04:50:04.199976 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:04.199974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" event={"ID":"18c3c3c7-021e-4aea-a6af-0e9416731c02","Type":"ContainerDied","Data":"b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200"} Apr 21 04:50:05.075390 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.075364 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:50:05.154853 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.154824 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-proxy-tls\") pod \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " Apr 21 04:50:05.155002 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.154867 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kserve-provision-location\") pod \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " Apr 21 04:50:05.155002 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.154892 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " Apr 21 04:50:05.155127 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.155012 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6957\" (UniqueName: \"kubernetes.io/projected/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kube-api-access-h6957\") pod \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\" (UID: \"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff\") " Apr 21 04:50:05.155188 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.155138 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" (UID: "c7b86e38-40b4-4c37-80f5-e769ffa8c7ff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:50:05.155248 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.155196 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" (UID: "c7b86e38-40b4-4c37-80f5-e769ffa8c7ff"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:50:05.155248 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.155206 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:50:05.156882 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.156862 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" (UID: "c7b86e38-40b4-4c37-80f5-e769ffa8c7ff"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:50:05.156961 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.156921 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kube-api-access-h6957" (OuterVolumeSpecName: "kube-api-access-h6957") pod "c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" (UID: "c7b86e38-40b4-4c37-80f5-e769ffa8c7ff"). InnerVolumeSpecName "kube-api-access-h6957". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:50:05.204272 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.204208 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" event={"ID":"18c3c3c7-021e-4aea-a6af-0e9416731c02","Type":"ContainerStarted","Data":"7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707"} Apr 21 04:50:05.204272 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.204242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" event={"ID":"18c3c3c7-021e-4aea-a6af-0e9416731c02","Type":"ContainerStarted","Data":"c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04"} Apr 21 04:50:05.204635 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.204470 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:50:05.204635 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.204528 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:50:05.205750 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.205726 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerID="ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6" exitCode=0 Apr 21 04:50:05.205832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.205797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" event={"ID":"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff","Type":"ContainerDied","Data":"ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6"} Apr 21 04:50:05.205832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.205822 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" event={"ID":"c7b86e38-40b4-4c37-80f5-e769ffa8c7ff","Type":"ContainerDied","Data":"3ab345796c0c7b9e0de2de1d726167b60dc6b9d3a29a159b3c24233a4c37e975"} Apr 21 04:50:05.205932 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.205837 2575 scope.go:117] "RemoveContainer" containerID="22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23" Apr 21 04:50:05.205932 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.205801 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9" Apr 21 04:50:05.213416 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.213212 2575 scope.go:117] "RemoveContainer" containerID="ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6" Apr 21 04:50:05.219845 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.219829 2575 scope.go:117] "RemoveContainer" containerID="98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100" Apr 21 04:50:05.223244 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.223206 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" podStartSLOduration=6.223193994 podStartE2EDuration="6.223193994s" podCreationTimestamp="2026-04-21 04:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:50:05.221966887 +0000 UTC m=+3190.886556988" watchObservedRunningTime="2026-04-21 04:50:05.223193994 +0000 UTC m=+3190.887784079" Apr 21 04:50:05.226684 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.226669 2575 scope.go:117] "RemoveContainer" containerID="22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23" Apr 21 04:50:05.226918 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:50:05.226901 2575 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23\": container with ID starting with 22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23 not found: ID does not exist" containerID="22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23" Apr 21 04:50:05.227108 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.226924 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23"} err="failed to get container status \"22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23\": rpc error: code = NotFound desc = could not find container \"22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23\": container with ID starting with 22d2d006e0d1e947cb3aea1f8c1f56fd37329d3456321850f9636694aa04fe23 not found: ID does not exist" Apr 21 04:50:05.227108 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.226940 2575 scope.go:117] "RemoveContainer" containerID="ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6" Apr 21 04:50:05.227205 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:50:05.227153 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6\": container with ID starting with ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6 not found: ID does not exist" containerID="ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6" Apr 21 04:50:05.227205 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.227166 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6"} err="failed to get container status 
\"ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6\": rpc error: code = NotFound desc = could not find container \"ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6\": container with ID starting with ce85cfb08b58b841071f050186db79b7e27498c39e6815c688e617266c2fa1b6 not found: ID does not exist" Apr 21 04:50:05.227205 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.227177 2575 scope.go:117] "RemoveContainer" containerID="98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100" Apr 21 04:50:05.227376 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:50:05.227362 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100\": container with ID starting with 98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100 not found: ID does not exist" containerID="98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100" Apr 21 04:50:05.227414 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.227378 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100"} err="failed to get container status \"98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100\": rpc error: code = NotFound desc = could not find container \"98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100\": container with ID starting with 98a0df42f07e69a8a7a045c693e900bd7d7690c1460a87f1f439ee1672460100 not found: ID does not exist" Apr 21 04:50:05.234156 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.234136 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9"] Apr 21 04:50:05.240518 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.240498 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-znsm9"] Apr 21 04:50:05.256514 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.256493 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:50:05.256601 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.256518 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:50:05.256601 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:05.256533 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6957\" (UniqueName: \"kubernetes.io/projected/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff-kube-api-access-h6957\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:50:07.027222 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:07.027187 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" path="/var/lib/kubelet/pods/c7b86e38-40b4-4c37-80f5-e769ffa8c7ff/volumes" Apr 21 04:50:11.215198 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:11.215169 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:50:41.218811 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:41.218754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:50:49.303696 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.303498 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj"] Apr 21 04:50:49.304137 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.303902 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kserve-container" containerID="cri-o://c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04" gracePeriod=30 Apr 21 04:50:49.304137 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.304066 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kube-rbac-proxy" containerID="cri-o://7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707" gracePeriod=30 Apr 21 04:50:49.388098 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.388056 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj"] Apr 21 04:50:49.388778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.388678 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kserve-container" Apr 21 04:50:49.388778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.388701 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kserve-container" Apr 21 04:50:49.388778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.388743 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="storage-initializer" Apr 21 04:50:49.391593 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.388752 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" 
containerName="storage-initializer" Apr 21 04:50:49.391714 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.391619 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kube-rbac-proxy" Apr 21 04:50:49.391714 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.391639 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kube-rbac-proxy" Apr 21 04:50:49.391933 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.391918 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kube-rbac-proxy" Apr 21 04:50:49.391977 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.391939 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7b86e38-40b4-4c37-80f5-e769ffa8c7ff" containerName="kserve-container" Apr 21 04:50:49.395036 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.395021 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.398078 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.398057 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 21 04:50:49.398441 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.398419 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 21 04:50:49.398876 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.398856 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj"] Apr 21 04:50:49.463507 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.463481 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.463646 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.463525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5t7f\" (UniqueName: \"kubernetes.io/projected/6d50aab3-3600-4b57-b671-b1756a4ab93b-kube-api-access-k5t7f\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.463646 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.463555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/6d50aab3-3600-4b57-b671-b1756a4ab93b-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.463646 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.463578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d50aab3-3600-4b57-b671-b1756a4ab93b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.564581 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.564484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.564581 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.564533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5t7f\" (UniqueName: \"kubernetes.io/projected/6d50aab3-3600-4b57-b671-b1756a4ab93b-kube-api-access-k5t7f\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.564581 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.564561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/6d50aab3-3600-4b57-b671-b1756a4ab93b-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.564581 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.564583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d50aab3-3600-4b57-b671-b1756a4ab93b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.564923 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:50:49.564656 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-serving-cert: secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 21 04:50:49.564923 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:50:49.564746 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls podName:6d50aab3-3600-4b57-b671-b1756a4ab93b nodeName:}" failed. No retries permitted until 2026-04-21 04:50:50.064724887 +0000 UTC m=+3235.729314967 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls") pod "isvc-xgboost-runtime-predictor-779db84d9-fsmmj" (UID: "6d50aab3-3600-4b57-b671-b1756a4ab93b") : secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 21 04:50:49.565082 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.565062 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d50aab3-3600-4b57-b671-b1756a4ab93b-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.565218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.565201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d50aab3-3600-4b57-b671-b1756a4ab93b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:49.574954 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:49.574932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5t7f\" (UniqueName: \"kubernetes.io/projected/6d50aab3-3600-4b57-b671-b1756a4ab93b-kube-api-access-k5t7f\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:50.068244 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:50.068212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls\") 
pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:50.070530 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:50.070508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-fsmmj\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:50.308324 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:50.308292 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:50.331220 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:50.331137 2575 generic.go:358] "Generic (PLEG): container finished" podID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerID="7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707" exitCode=2 Apr 21 04:50:50.331220 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:50.331199 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" event={"ID":"18c3c3c7-021e-4aea-a6af-0e9416731c02","Type":"ContainerDied","Data":"7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707"} Apr 21 04:50:50.429673 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:50.429652 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj"] Apr 21 04:50:50.431824 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:50:50.431793 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d50aab3_3600_4b57_b671_b1756a4ab93b.slice/crio-48d0f4f40b5e77ce054e8020593e964fb532d27b8a87d475d94b4d7a0c4dcd68 WatchSource:0}: Error finding container 48d0f4f40b5e77ce054e8020593e964fb532d27b8a87d475d94b4d7a0c4dcd68: Status 404 returned error can't find the container with id 48d0f4f40b5e77ce054e8020593e964fb532d27b8a87d475d94b4d7a0c4dcd68 Apr 21 04:50:51.211370 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:51.211328 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.57:8643/healthz\": dial tcp 10.132.0.57:8643: connect: connection refused" Apr 21 04:50:51.335593 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:51.335557 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" event={"ID":"6d50aab3-3600-4b57-b671-b1756a4ab93b","Type":"ContainerStarted","Data":"657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9"} Apr 21 04:50:51.335593 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:51.335597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" event={"ID":"6d50aab3-3600-4b57-b671-b1756a4ab93b","Type":"ContainerStarted","Data":"48d0f4f40b5e77ce054e8020593e964fb532d27b8a87d475d94b4d7a0c4dcd68"} Apr 21 04:50:55.335961 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.335939 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:50:55.347415 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.347389 2575 generic.go:358] "Generic (PLEG): container finished" podID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerID="657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9" exitCode=0 Apr 21 04:50:55.347543 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.347464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" event={"ID":"6d50aab3-3600-4b57-b671-b1756a4ab93b","Type":"ContainerDied","Data":"657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9"} Apr 21 04:50:55.349231 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.349208 2575 generic.go:358] "Generic (PLEG): container finished" podID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerID="c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04" exitCode=0 Apr 21 04:50:55.349338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.349242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" event={"ID":"18c3c3c7-021e-4aea-a6af-0e9416731c02","Type":"ContainerDied","Data":"c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04"} Apr 21 04:50:55.349338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.349275 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" event={"ID":"18c3c3c7-021e-4aea-a6af-0e9416731c02","Type":"ContainerDied","Data":"d3e01daad9d528b52b5086a8f05210b8c5677e0046d12a1336375209e7e94f6f"} Apr 21 04:50:55.349338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.349292 2575 scope.go:117] "RemoveContainer" containerID="7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707" Apr 21 04:50:55.349338 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.349293 2575 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj" Apr 21 04:50:55.357271 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.357154 2575 scope.go:117] "RemoveContainer" containerID="c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04" Apr 21 04:50:55.367265 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.367233 2575 scope.go:117] "RemoveContainer" containerID="b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200" Apr 21 04:50:55.379889 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.379866 2575 scope.go:117] "RemoveContainer" containerID="7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707" Apr 21 04:50:55.380154 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:50:55.380127 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707\": container with ID starting with 7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707 not found: ID does not exist" containerID="7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707" Apr 21 04:50:55.380206 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.380163 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707"} err="failed to get container status \"7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707\": rpc error: code = NotFound desc = could not find container \"7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707\": container with ID starting with 7386948c93d6819c6729225ce1203bd4141005db98dac8c7bb47d506ed2a7707 not found: ID does not exist" Apr 21 04:50:55.380206 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.380180 2575 scope.go:117] "RemoveContainer" 
containerID="c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04" Apr 21 04:50:55.380414 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:50:55.380394 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04\": container with ID starting with c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04 not found: ID does not exist" containerID="c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04" Apr 21 04:50:55.380505 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.380419 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04"} err="failed to get container status \"c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04\": rpc error: code = NotFound desc = could not find container \"c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04\": container with ID starting with c8b97bee24826765c206aa917af429ff59a72a53051cec8e4c17f10f83eb5b04 not found: ID does not exist" Apr 21 04:50:55.380505 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.380436 2575 scope.go:117] "RemoveContainer" containerID="b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200" Apr 21 04:50:55.380703 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:50:55.380686 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200\": container with ID starting with b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200 not found: ID does not exist" containerID="b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200" Apr 21 04:50:55.380770 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.380709 2575 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200"} err="failed to get container status \"b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200\": rpc error: code = NotFound desc = could not find container \"b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200\": container with ID starting with b486187b50a8687f469216653e22827ba1796769a35a0381c4a6aa3c6bc14200 not found: ID does not exist" Apr 21 04:50:55.411953 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.411934 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18c3c3c7-021e-4aea-a6af-0e9416731c02-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"18c3c3c7-021e-4aea-a6af-0e9416731c02\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " Apr 21 04:50:55.412036 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.411972 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18c3c3c7-021e-4aea-a6af-0e9416731c02-proxy-tls\") pod \"18c3c3c7-021e-4aea-a6af-0e9416731c02\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " Apr 21 04:50:55.412036 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.412017 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18c3c3c7-021e-4aea-a6af-0e9416731c02-kserve-provision-location\") pod \"18c3c3c7-021e-4aea-a6af-0e9416731c02\" (UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " Apr 21 04:50:55.412114 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.412046 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xtmn\" (UniqueName: \"kubernetes.io/projected/18c3c3c7-021e-4aea-a6af-0e9416731c02-kube-api-access-8xtmn\") pod \"18c3c3c7-021e-4aea-a6af-0e9416731c02\" 
(UID: \"18c3c3c7-021e-4aea-a6af-0e9416731c02\") " Apr 21 04:50:55.412285 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.412262 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c3c3c7-021e-4aea-a6af-0e9416731c02-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "18c3c3c7-021e-4aea-a6af-0e9416731c02" (UID: "18c3c3c7-021e-4aea-a6af-0e9416731c02"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:50:55.412368 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.412291 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c3c3c7-021e-4aea-a6af-0e9416731c02-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "18c3c3c7-021e-4aea-a6af-0e9416731c02" (UID: "18c3c3c7-021e-4aea-a6af-0e9416731c02"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:50:55.413827 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.413809 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c3c3c7-021e-4aea-a6af-0e9416731c02-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "18c3c3c7-021e-4aea-a6af-0e9416731c02" (UID: "18c3c3c7-021e-4aea-a6af-0e9416731c02"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:50:55.413976 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.413957 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c3c3c7-021e-4aea-a6af-0e9416731c02-kube-api-access-8xtmn" (OuterVolumeSpecName: "kube-api-access-8xtmn") pod "18c3c3c7-021e-4aea-a6af-0e9416731c02" (UID: "18c3c3c7-021e-4aea-a6af-0e9416731c02"). InnerVolumeSpecName "kube-api-access-8xtmn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:50:55.512585 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.512558 2575 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18c3c3c7-021e-4aea-a6af-0e9416731c02-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:50:55.512715 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.512589 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18c3c3c7-021e-4aea-a6af-0e9416731c02-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:50:55.512715 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.512604 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18c3c3c7-021e-4aea-a6af-0e9416731c02-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:50:55.512715 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.512617 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8xtmn\" (UniqueName: \"kubernetes.io/projected/18c3c3c7-021e-4aea-a6af-0e9416731c02-kube-api-access-8xtmn\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:50:55.671473 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.671442 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj"] Apr 21 04:50:55.675221 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:55.675197 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8gsjj"] Apr 21 04:50:56.353800 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:56.353753 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" event={"ID":"6d50aab3-3600-4b57-b671-b1756a4ab93b","Type":"ContainerStarted","Data":"0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1"} Apr 21 04:50:56.354192 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:56.353810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" event={"ID":"6d50aab3-3600-4b57-b671-b1756a4ab93b","Type":"ContainerStarted","Data":"cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a"} Apr 21 04:50:56.354192 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:56.354016 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:56.373330 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:56.373289 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podStartSLOduration=7.373277597 podStartE2EDuration="7.373277597s" podCreationTimestamp="2026-04-21 04:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:50:56.371700083 +0000 UTC m=+3242.036290168" watchObservedRunningTime="2026-04-21 04:50:56.373277597 +0000 UTC m=+3242.037867682" Apr 21 04:50:57.027838 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:57.027805 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" path="/var/lib/kubelet/pods/18c3c3c7-021e-4aea-a6af-0e9416731c02/volumes" Apr 21 04:50:57.357208 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:57.357133 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:50:57.358391 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:50:57.358360 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 21 04:50:58.359747 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:50:58.359711 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 21 04:51:03.364388 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:03.364359 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:51:03.364953 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:03.364929 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 21 04:51:13.365725 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:13.365640 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 21 04:51:23.365279 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:23.365234 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.58:8080: connect: connection refused" Apr 21 04:51:33.365397 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:33.365359 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 21 04:51:43.365704 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:43.365658 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 21 04:51:53.365923 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:53.365895 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:51:59.482656 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.482623 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj"] Apr 21 04:51:59.483239 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.482976 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" containerID="cri-o://cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a" gracePeriod=30 Apr 21 04:51:59.483239 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.483039 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kube-rbac-proxy" 
containerID="cri-o://0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1" gracePeriod=30 Apr 21 04:51:59.556627 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.556596 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f"] Apr 21 04:51:59.556914 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.556901 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="storage-initializer" Apr 21 04:51:59.556966 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.556917 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="storage-initializer" Apr 21 04:51:59.556966 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.556925 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kube-rbac-proxy" Apr 21 04:51:59.556966 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.556931 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kube-rbac-proxy" Apr 21 04:51:59.556966 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.556939 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kserve-container" Apr 21 04:51:59.556966 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.556945 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kserve-container" Apr 21 04:51:59.557163 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.557026 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kube-rbac-proxy" Apr 21 04:51:59.557163 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.557046 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="18c3c3c7-021e-4aea-a6af-0e9416731c02" containerName="kserve-container" Apr 21 04:51:59.560099 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.560080 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.562447 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.562427 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 21 04:51:59.562779 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.562746 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 21 04:51:59.570211 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.570191 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f"] Apr 21 04:51:59.642178 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.642155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87991a58-fda7-43d1-a412-d7c81b5401be-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.642285 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.642198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87991a58-fda7-43d1-a412-d7c81b5401be-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: 
\"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.642285 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.642219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87991a58-fda7-43d1-a412-d7c81b5401be-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.642359 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.642309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzjt5\" (UniqueName: \"kubernetes.io/projected/87991a58-fda7-43d1-a412-d7c81b5401be-kube-api-access-fzjt5\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.742934 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.742864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzjt5\" (UniqueName: \"kubernetes.io/projected/87991a58-fda7-43d1-a412-d7c81b5401be-kube-api-access-fzjt5\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.742934 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.742908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87991a58-fda7-43d1-a412-d7c81b5401be-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.743112 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.742943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87991a58-fda7-43d1-a412-d7c81b5401be-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.743112 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.742962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87991a58-fda7-43d1-a412-d7c81b5401be-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.743333 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.743309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87991a58-fda7-43d1-a412-d7c81b5401be-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.743623 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.743601 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87991a58-fda7-43d1-a412-d7c81b5401be-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.745315 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.745296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87991a58-fda7-43d1-a412-d7c81b5401be-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.751028 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.751006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzjt5\" (UniqueName: \"kubernetes.io/projected/87991a58-fda7-43d1-a412-d7c81b5401be-kube-api-access-fzjt5\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.870168 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.870145 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:51:59.991233 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:51:59.991190 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87991a58_fda7_43d1_a412_d7c81b5401be.slice/crio-2f499a264532424ebdcc37cacb76ff8d4d379c4d667085a8b195701b7ebf16f1 WatchSource:0}: Error finding container 2f499a264532424ebdcc37cacb76ff8d4d379c4d667085a8b195701b7ebf16f1: Status 404 returned error can't find the container with id 2f499a264532424ebdcc37cacb76ff8d4d379c4d667085a8b195701b7ebf16f1 Apr 21 04:51:59.994334 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:51:59.994293 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f"] Apr 21 04:52:00.523472 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:00.523440 2575 generic.go:358] "Generic (PLEG): container finished" podID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerID="0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1" exitCode=2 Apr 21 04:52:00.523902 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:00.523514 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" event={"ID":"6d50aab3-3600-4b57-b671-b1756a4ab93b","Type":"ContainerDied","Data":"0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1"} Apr 21 04:52:00.524803 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:00.524774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" event={"ID":"87991a58-fda7-43d1-a412-d7c81b5401be","Type":"ContainerStarted","Data":"d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485"} Apr 21 04:52:00.524913 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:00.524810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" event={"ID":"87991a58-fda7-43d1-a412-d7c81b5401be","Type":"ContainerStarted","Data":"2f499a264532424ebdcc37cacb76ff8d4d379c4d667085a8b195701b7ebf16f1"} Apr 21 04:52:02.717708 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.717686 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:52:02.767389 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.767366 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d50aab3-3600-4b57-b671-b1756a4ab93b-kserve-provision-location\") pod \"6d50aab3-3600-4b57-b671-b1756a4ab93b\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " Apr 21 04:52:02.767497 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.767424 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls\") pod \"6d50aab3-3600-4b57-b671-b1756a4ab93b\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " Apr 21 04:52:02.767497 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.767456 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5t7f\" (UniqueName: \"kubernetes.io/projected/6d50aab3-3600-4b57-b671-b1756a4ab93b-kube-api-access-k5t7f\") pod \"6d50aab3-3600-4b57-b671-b1756a4ab93b\" (UID: \"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " Apr 21 04:52:02.767497 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.767487 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d50aab3-3600-4b57-b671-b1756a4ab93b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"6d50aab3-3600-4b57-b671-b1756a4ab93b\" (UID: 
\"6d50aab3-3600-4b57-b671-b1756a4ab93b\") " Apr 21 04:52:02.767656 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.767635 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d50aab3-3600-4b57-b671-b1756a4ab93b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6d50aab3-3600-4b57-b671-b1756a4ab93b" (UID: "6d50aab3-3600-4b57-b671-b1756a4ab93b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:52:02.767880 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.767853 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d50aab3-3600-4b57-b671-b1756a4ab93b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "6d50aab3-3600-4b57-b671-b1756a4ab93b" (UID: "6d50aab3-3600-4b57-b671-b1756a4ab93b"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:52:02.769463 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.769444 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d50aab3-3600-4b57-b671-b1756a4ab93b-kube-api-access-k5t7f" (OuterVolumeSpecName: "kube-api-access-k5t7f") pod "6d50aab3-3600-4b57-b671-b1756a4ab93b" (UID: "6d50aab3-3600-4b57-b671-b1756a4ab93b"). InnerVolumeSpecName "kube-api-access-k5t7f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:52:02.769532 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.769482 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6d50aab3-3600-4b57-b671-b1756a4ab93b" (UID: "6d50aab3-3600-4b57-b671-b1756a4ab93b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:52:02.868007 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.867955 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5t7f\" (UniqueName: \"kubernetes.io/projected/6d50aab3-3600-4b57-b671-b1756a4ab93b-kube-api-access-k5t7f\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:52:02.868007 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.867976 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d50aab3-3600-4b57-b671-b1756a4ab93b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:52:02.868007 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.867986 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d50aab3-3600-4b57-b671-b1756a4ab93b-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:52:02.868007 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:02.867997 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d50aab3-3600-4b57-b671-b1756a4ab93b-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:52:03.534793 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.534742 2575 generic.go:358] "Generic (PLEG): container finished" podID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerID="cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a" exitCode=0 Apr 21 04:52:03.534969 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.534806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" 
event={"ID":"6d50aab3-3600-4b57-b671-b1756a4ab93b","Type":"ContainerDied","Data":"cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a"} Apr 21 04:52:03.534969 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.534833 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" Apr 21 04:52:03.534969 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.534840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj" event={"ID":"6d50aab3-3600-4b57-b671-b1756a4ab93b","Type":"ContainerDied","Data":"48d0f4f40b5e77ce054e8020593e964fb532d27b8a87d475d94b4d7a0c4dcd68"} Apr 21 04:52:03.534969 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.534858 2575 scope.go:117] "RemoveContainer" containerID="0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1" Apr 21 04:52:03.542571 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.542553 2575 scope.go:117] "RemoveContainer" containerID="cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a" Apr 21 04:52:03.549001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.548985 2575 scope.go:117] "RemoveContainer" containerID="657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9" Apr 21 04:52:03.551426 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.551398 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj"] Apr 21 04:52:03.555431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.555410 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-fsmmj"] Apr 21 04:52:03.557096 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.557085 2575 scope.go:117] "RemoveContainer" containerID="0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1" Apr 21 04:52:03.557338 ip-10-0-134-15 
kubenswrapper[2575]: E0421 04:52:03.557320 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1\": container with ID starting with 0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1 not found: ID does not exist" containerID="0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1" Apr 21 04:52:03.557395 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.557346 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1"} err="failed to get container status \"0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1\": rpc error: code = NotFound desc = could not find container \"0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1\": container with ID starting with 0303c95210d125a874a1953e3a4527ce138df49bbb3dd9b0dc03e084b6b799b1 not found: ID does not exist" Apr 21 04:52:03.557395 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.557363 2575 scope.go:117] "RemoveContainer" containerID="cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a" Apr 21 04:52:03.557589 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:52:03.557572 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a\": container with ID starting with cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a not found: ID does not exist" containerID="cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a" Apr 21 04:52:03.557629 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.557596 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a"} err="failed to 
get container status \"cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a\": rpc error: code = NotFound desc = could not find container \"cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a\": container with ID starting with cd1638f7bf4afc3ce2010dd4d25b8a4f3a03618ff360698de6664448a47b518a not found: ID does not exist" Apr 21 04:52:03.557629 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.557613 2575 scope.go:117] "RemoveContainer" containerID="657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9" Apr 21 04:52:03.557860 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:52:03.557841 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9\": container with ID starting with 657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9 not found: ID does not exist" containerID="657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9" Apr 21 04:52:03.557937 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:03.557869 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9"} err="failed to get container status \"657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9\": rpc error: code = NotFound desc = could not find container \"657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9\": container with ID starting with 657c29068bb1e7d13dc3931db212bd08620d19b4f6347008c7903c49982632c9 not found: ID does not exist" Apr 21 04:52:04.538359 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:04.538327 2575 generic.go:358] "Generic (PLEG): container finished" podID="87991a58-fda7-43d1-a412-d7c81b5401be" containerID="d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485" exitCode=0 Apr 21 04:52:04.538821 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:04.538403 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" event={"ID":"87991a58-fda7-43d1-a412-d7c81b5401be","Type":"ContainerDied","Data":"d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485"} Apr 21 04:52:05.027497 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:05.027464 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" path="/var/lib/kubelet/pods/6d50aab3-3600-4b57-b671-b1756a4ab93b/volumes" Apr 21 04:52:05.543154 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:05.543121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" event={"ID":"87991a58-fda7-43d1-a412-d7c81b5401be","Type":"ContainerStarted","Data":"77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319"} Apr 21 04:52:05.543574 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:05.543163 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" event={"ID":"87991a58-fda7-43d1-a412-d7c81b5401be","Type":"ContainerStarted","Data":"959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82"} Apr 21 04:52:05.543574 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:05.543378 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:52:05.562256 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:05.562213 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" podStartSLOduration=6.562201406 podStartE2EDuration="6.562201406s" podCreationTimestamp="2026-04-21 04:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 
04:52:05.561472059 +0000 UTC m=+3311.226062168" watchObservedRunningTime="2026-04-21 04:52:05.562201406 +0000 UTC m=+3311.226791544" Apr 21 04:52:06.546212 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:06.546184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:52:12.553854 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:12.553819 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:52:42.638829 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:42.638780 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 21 04:52:52.556854 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:52.556827 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:52:59.668991 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.668948 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f"] Apr 21 04:52:59.669502 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.669409 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kserve-container" containerID="cri-o://959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82" gracePeriod=30 Apr 21 04:52:59.669823 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.669743 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kube-rbac-proxy" containerID="cri-o://77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319" gracePeriod=30 Apr 21 04:52:59.744778 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.744739 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7"] Apr 21 04:52:59.745118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.745101 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kube-rbac-proxy" Apr 21 04:52:59.745200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.745121 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kube-rbac-proxy" Apr 21 04:52:59.745200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.745136 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" Apr 21 04:52:59.745200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.745143 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kserve-container" Apr 21 04:52:59.745200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.745163 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="storage-initializer" Apr 21 04:52:59.745200 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.745173 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="storage-initializer" Apr 21 04:52:59.745473 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.745257 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" 
containerName="kserve-container" Apr 21 04:52:59.745473 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.745269 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d50aab3-3600-4b57-b671-b1756a4ab93b" containerName="kube-rbac-proxy" Apr 21 04:52:59.748302 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.748282 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.751213 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.751193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 21 04:52:59.751329 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.751303 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 21 04:52:59.759781 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.759746 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7"] Apr 21 04:52:59.859151 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.859122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0692334-5af0-46c4-9cb7-d562b1e5606c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.859304 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.859171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfmdd\" (UniqueName: \"kubernetes.io/projected/f0692334-5af0-46c4-9cb7-d562b1e5606c-kube-api-access-wfmdd\") pod 
\"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.859304 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.859291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0692334-5af0-46c4-9cb7-d562b1e5606c-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.859411 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.859330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0692334-5af0-46c4-9cb7-d562b1e5606c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.960641 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.960571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfmdd\" (UniqueName: \"kubernetes.io/projected/f0692334-5af0-46c4-9cb7-d562b1e5606c-kube-api-access-wfmdd\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.960641 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.960624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0692334-5af0-46c4-9cb7-d562b1e5606c-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 
04:52:59.960882 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.960860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0692334-5af0-46c4-9cb7-d562b1e5606c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.961009 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.960989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0692334-5af0-46c4-9cb7-d562b1e5606c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.961229 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.961211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0692334-5af0-46c4-9cb7-d562b1e5606c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.961568 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.961549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0692334-5af0-46c4-9cb7-d562b1e5606c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.963016 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.962998 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0692334-5af0-46c4-9cb7-d562b1e5606c-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:52:59.968560 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:52:59.968538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfmdd\" (UniqueName: \"kubernetes.io/projected/f0692334-5af0-46c4-9cb7-d562b1e5606c-kube-api-access-wfmdd\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k4db7\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:53:00.057926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:00.057890 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:53:00.174553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:00.174529 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7"] Apr 21 04:53:00.176733 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:53:00.176708 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0692334_5af0_46c4_9cb7_d562b1e5606c.slice/crio-e2d62625f7f4d33f3d475a130a95dfd4b57cc8aab62474705c5c8d826030f142 WatchSource:0}: Error finding container e2d62625f7f4d33f3d475a130a95dfd4b57cc8aab62474705c5c8d826030f142: Status 404 returned error can't find the container with id e2d62625f7f4d33f3d475a130a95dfd4b57cc8aab62474705c5c8d826030f142 Apr 21 04:53:00.695862 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:00.695833 2575 generic.go:358] "Generic (PLEG): container finished" podID="87991a58-fda7-43d1-a412-d7c81b5401be" 
containerID="77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319" exitCode=2 Apr 21 04:53:00.696231 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:00.695903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" event={"ID":"87991a58-fda7-43d1-a412-d7c81b5401be","Type":"ContainerDied","Data":"77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319"} Apr 21 04:53:00.697223 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:00.697201 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" event={"ID":"f0692334-5af0-46c4-9cb7-d562b1e5606c","Type":"ContainerStarted","Data":"dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4"} Apr 21 04:53:00.697314 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:00.697228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" event={"ID":"f0692334-5af0-46c4-9cb7-d562b1e5606c","Type":"ContainerStarted","Data":"e2d62625f7f4d33f3d475a130a95dfd4b57cc8aab62474705c5c8d826030f142"} Apr 21 04:53:02.549987 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:02.549943 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 21 04:53:04.708472 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:04.708437 2575 generic.go:358] "Generic (PLEG): container finished" podID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerID="dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4" exitCode=0 Apr 21 04:53:04.708880 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:04.708508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" event={"ID":"f0692334-5af0-46c4-9cb7-d562b1e5606c","Type":"ContainerDied","Data":"dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4"} Apr 21 04:53:05.713991 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:05.713956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" event={"ID":"f0692334-5af0-46c4-9cb7-d562b1e5606c","Type":"ContainerStarted","Data":"0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50"} Apr 21 04:53:05.714482 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:05.714000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" event={"ID":"f0692334-5af0-46c4-9cb7-d562b1e5606c","Type":"ContainerStarted","Data":"a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121"} Apr 21 04:53:05.714482 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:05.714215 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:53:05.736336 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:05.736292 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podStartSLOduration=6.73627674 podStartE2EDuration="6.73627674s" podCreationTimestamp="2026-04-21 04:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:53:05.734604388 +0000 UTC m=+3371.399194484" watchObservedRunningTime="2026-04-21 04:53:05.73627674 +0000 UTC m=+3371.400866828" Apr 21 04:53:06.507855 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.507827 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:53:06.611054 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.610991 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87991a58-fda7-43d1-a412-d7c81b5401be-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"87991a58-fda7-43d1-a412-d7c81b5401be\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " Apr 21 04:53:06.611054 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.611028 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzjt5\" (UniqueName: \"kubernetes.io/projected/87991a58-fda7-43d1-a412-d7c81b5401be-kube-api-access-fzjt5\") pod \"87991a58-fda7-43d1-a412-d7c81b5401be\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " Apr 21 04:53:06.611237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.611064 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87991a58-fda7-43d1-a412-d7c81b5401be-kserve-provision-location\") pod \"87991a58-fda7-43d1-a412-d7c81b5401be\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " Apr 21 04:53:06.611237 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.611087 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87991a58-fda7-43d1-a412-d7c81b5401be-proxy-tls\") pod \"87991a58-fda7-43d1-a412-d7c81b5401be\" (UID: \"87991a58-fda7-43d1-a412-d7c81b5401be\") " Apr 21 04:53:06.611355 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.611310 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87991a58-fda7-43d1-a412-d7c81b5401be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"87991a58-fda7-43d1-a412-d7c81b5401be" (UID: "87991a58-fda7-43d1-a412-d7c81b5401be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:53:06.611526 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.611503 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87991a58-fda7-43d1-a412-d7c81b5401be-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "87991a58-fda7-43d1-a412-d7c81b5401be" (UID: "87991a58-fda7-43d1-a412-d7c81b5401be"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:53:06.612970 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.612946 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87991a58-fda7-43d1-a412-d7c81b5401be-kube-api-access-fzjt5" (OuterVolumeSpecName: "kube-api-access-fzjt5") pod "87991a58-fda7-43d1-a412-d7c81b5401be" (UID: "87991a58-fda7-43d1-a412-d7c81b5401be"). InnerVolumeSpecName "kube-api-access-fzjt5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:53:06.613079 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.613032 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87991a58-fda7-43d1-a412-d7c81b5401be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "87991a58-fda7-43d1-a412-d7c81b5401be" (UID: "87991a58-fda7-43d1-a412-d7c81b5401be"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:53:06.712476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.712455 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87991a58-fda7-43d1-a412-d7c81b5401be-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:53:06.712476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.712475 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87991a58-fda7-43d1-a412-d7c81b5401be-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:53:06.712613 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.712486 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87991a58-fda7-43d1-a412-d7c81b5401be-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:53:06.712613 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.712496 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fzjt5\" (UniqueName: \"kubernetes.io/projected/87991a58-fda7-43d1-a412-d7c81b5401be-kube-api-access-fzjt5\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:53:06.718016 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.717991 2575 generic.go:358] "Generic (PLEG): container finished" podID="87991a58-fda7-43d1-a412-d7c81b5401be" containerID="959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82" exitCode=0 Apr 21 04:53:06.718383 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.718073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" 
event={"ID":"87991a58-fda7-43d1-a412-d7c81b5401be","Type":"ContainerDied","Data":"959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82"} Apr 21 04:53:06.718383 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.718088 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" Apr 21 04:53:06.718383 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.718110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f" event={"ID":"87991a58-fda7-43d1-a412-d7c81b5401be","Type":"ContainerDied","Data":"2f499a264532424ebdcc37cacb76ff8d4d379c4d667085a8b195701b7ebf16f1"} Apr 21 04:53:06.718383 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.718127 2575 scope.go:117] "RemoveContainer" containerID="77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319" Apr 21 04:53:06.718812 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.718793 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:53:06.720150 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.720125 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 21 04:53:06.726780 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.726694 2575 scope.go:117] "RemoveContainer" containerID="959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82" Apr 21 04:53:06.737275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.737256 2575 scope.go:117] "RemoveContainer" containerID="d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485" Apr 21 04:53:06.743718 ip-10-0-134-15 kubenswrapper[2575]: 
I0421 04:53:06.743701 2575 scope.go:117] "RemoveContainer" containerID="77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319" Apr 21 04:53:06.744015 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:53:06.743995 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319\": container with ID starting with 77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319 not found: ID does not exist" containerID="77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319" Apr 21 04:53:06.744077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.744022 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319"} err="failed to get container status \"77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319\": rpc error: code = NotFound desc = could not find container \"77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319\": container with ID starting with 77cb8b5a9df7d8aafc76fd54ea38a8aad20552a9376f8ca08851cf38d24a5319 not found: ID does not exist" Apr 21 04:53:06.744077 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.744038 2575 scope.go:117] "RemoveContainer" containerID="959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82" Apr 21 04:53:06.744270 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:53:06.744253 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82\": container with ID starting with 959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82 not found: ID does not exist" containerID="959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82" Apr 21 04:53:06.744315 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.744273 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82"} err="failed to get container status \"959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82\": rpc error: code = NotFound desc = could not find container \"959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82\": container with ID starting with 959b077b9a8c28e32f4a3b5fc8267ca058808fa257db0a18dd5cfb32abe5af82 not found: ID does not exist" Apr 21 04:53:06.744315 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.744287 2575 scope.go:117] "RemoveContainer" containerID="d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485" Apr 21 04:53:06.744518 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:53:06.744501 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485\": container with ID starting with d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485 not found: ID does not exist" containerID="d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485" Apr 21 04:53:06.744573 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.744522 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485"} err="failed to get container status \"d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485\": rpc error: code = NotFound desc = could not find container \"d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485\": container with ID starting with d1423e623a440f17670f42554b042212971e9c2d6d07f2ae85e598723b7a1485 not found: ID does not exist" Apr 21 04:53:06.748538 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.748518 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f"] Apr 21 04:53:06.754031 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:06.754010 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-bl87f"] Apr 21 04:53:07.027242 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:07.027210 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" path="/var/lib/kubelet/pods/87991a58-fda7-43d1-a412-d7c81b5401be/volumes" Apr 21 04:53:07.721859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:07.721819 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 21 04:53:12.725995 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:12.725967 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:53:12.726456 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:12.726429 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 21 04:53:22.726743 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:22.726703 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 21 04:53:32.726859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:32.726819 2575 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 21 04:53:42.727127 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:42.727087 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 21 04:53:52.726469 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:53:52.726386 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 21 04:54:02.727498 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:02.727470 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:54:09.849562 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.849531 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7"] Apr 21 04:54:09.849991 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.849811 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" containerID="cri-o://a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121" gracePeriod=30 Apr 21 04:54:09.849991 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.849858 2575 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kube-rbac-proxy" containerID="cri-o://0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50" gracePeriod=30 Apr 21 04:54:09.945441 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945411 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj"] Apr 21 04:54:09.945693 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945682 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="storage-initializer" Apr 21 04:54:09.945739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945695 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="storage-initializer" Apr 21 04:54:09.945739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945706 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kserve-container" Apr 21 04:54:09.945739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945712 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kserve-container" Apr 21 04:54:09.945739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945731 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kube-rbac-proxy" Apr 21 04:54:09.945739 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945737 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kube-rbac-proxy" Apr 21 04:54:09.945931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945791 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" 
containerName="kserve-container" Apr 21 04:54:09.945931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.945802 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="87991a58-fda7-43d1-a412-d7c81b5401be" containerName="kube-rbac-proxy" Apr 21 04:54:09.948679 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.948663 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:09.950900 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.950878 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 21 04:54:09.951105 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.951090 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 21 04:54:09.951185 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.951125 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 21 04:54:09.957927 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:09.957907 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj"] Apr 21 04:54:10.042501 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.042475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwgs6\" (UniqueName: \"kubernetes.io/projected/f84a184a-823c-49f4-93a1-ba1fd0267128-kube-api-access-wwgs6\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.042634 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.042518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.042634 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.042542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f84a184a-823c-49f4-93a1-ba1fd0267128-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.042634 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.042594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f84a184a-823c-49f4-93a1-ba1fd0267128-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.143735 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.143712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.143873 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.143744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f84a184a-823c-49f4-93a1-ba1fd0267128-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.143941 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:54:10.143865 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 21 04:54:10.143941 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.143878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f84a184a-823c-49f4-93a1-ba1fd0267128-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.144030 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:54:10.143954 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls podName:f84a184a-823c-49f4-93a1-ba1fd0267128 nodeName:}" failed. No retries permitted until 2026-04-21 04:54:10.643920091 +0000 UTC m=+3436.308510164 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls") pod "isvc-sklearn-s3-predictor-88457d696-nk8sj" (UID: "f84a184a-823c-49f4-93a1-ba1fd0267128") : secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 21 04:54:10.144030 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.144002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwgs6\" (UniqueName: \"kubernetes.io/projected/f84a184a-823c-49f4-93a1-ba1fd0267128-kube-api-access-wwgs6\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.144203 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.144187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f84a184a-823c-49f4-93a1-ba1fd0267128-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.144410 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.144393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f84a184a-823c-49f4-93a1-ba1fd0267128-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.152660 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.152633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwgs6\" (UniqueName: \"kubernetes.io/projected/f84a184a-823c-49f4-93a1-ba1fd0267128-kube-api-access-wwgs6\") pod 
\"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.647865 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.647837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.650157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.650132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-nk8sj\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.860096 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.860063 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:10.895581 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.895544 2575 generic.go:358] "Generic (PLEG): container finished" podID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerID="0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50" exitCode=2 Apr 21 04:54:10.896348 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.895611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" event={"ID":"f0692334-5af0-46c4-9cb7-d562b1e5606c","Type":"ContainerDied","Data":"0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50"} Apr 21 04:54:10.977730 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.977708 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj"] Apr 21 04:54:10.979777 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:54:10.979726 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf84a184a_823c_49f4_93a1_ba1fd0267128.slice/crio-100a1a1ae24a203eb26fa396d8f99c9ffb348683c1bff7cf781955ead1f67ec5 WatchSource:0}: Error finding container 100a1a1ae24a203eb26fa396d8f99c9ffb348683c1bff7cf781955ead1f67ec5: Status 404 returned error can't find the container with id 100a1a1ae24a203eb26fa396d8f99c9ffb348683c1bff7cf781955ead1f67ec5 Apr 21 04:54:10.981730 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:10.981713 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:54:11.899431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:11.899390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" 
event={"ID":"f84a184a-823c-49f4-93a1-ba1fd0267128","Type":"ContainerStarted","Data":"6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d"} Apr 21 04:54:11.899431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:11.899434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" event={"ID":"f84a184a-823c-49f4-93a1-ba1fd0267128","Type":"ContainerStarted","Data":"100a1a1ae24a203eb26fa396d8f99c9ffb348683c1bff7cf781955ead1f67ec5"} Apr 21 04:54:12.722143 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:12.722096 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.60:8643/healthz\": dial tcp 10.132.0.60:8643: connect: connection refused" Apr 21 04:54:12.726424 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:12.726396 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 21 04:54:12.903441 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:12.903404 2575 generic.go:358] "Generic (PLEG): container finished" podID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerID="6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d" exitCode=0 Apr 21 04:54:12.903943 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:12.903487 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" event={"ID":"f84a184a-823c-49f4-93a1-ba1fd0267128","Type":"ContainerDied","Data":"6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d"} Apr 21 04:54:13.098490 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.098467 
2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:54:13.167470 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.167441 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0692334-5af0-46c4-9cb7-d562b1e5606c-proxy-tls\") pod \"f0692334-5af0-46c4-9cb7-d562b1e5606c\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " Apr 21 04:54:13.167642 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.167510 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0692334-5af0-46c4-9cb7-d562b1e5606c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"f0692334-5af0-46c4-9cb7-d562b1e5606c\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " Apr 21 04:54:13.167642 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.167579 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfmdd\" (UniqueName: \"kubernetes.io/projected/f0692334-5af0-46c4-9cb7-d562b1e5606c-kube-api-access-wfmdd\") pod \"f0692334-5af0-46c4-9cb7-d562b1e5606c\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " Apr 21 04:54:13.167642 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.167607 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0692334-5af0-46c4-9cb7-d562b1e5606c-kserve-provision-location\") pod \"f0692334-5af0-46c4-9cb7-d562b1e5606c\" (UID: \"f0692334-5af0-46c4-9cb7-d562b1e5606c\") " Apr 21 04:54:13.167922 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.167889 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0692334-5af0-46c4-9cb7-d562b1e5606c-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "f0692334-5af0-46c4-9cb7-d562b1e5606c" (UID: "f0692334-5af0-46c4-9cb7-d562b1e5606c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:54:13.168027 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.167922 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0692334-5af0-46c4-9cb7-d562b1e5606c-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "f0692334-5af0-46c4-9cb7-d562b1e5606c" (UID: "f0692334-5af0-46c4-9cb7-d562b1e5606c"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:54:13.169377 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.169352 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0692334-5af0-46c4-9cb7-d562b1e5606c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f0692334-5af0-46c4-9cb7-d562b1e5606c" (UID: "f0692334-5af0-46c4-9cb7-d562b1e5606c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:54:13.169457 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.169442 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0692334-5af0-46c4-9cb7-d562b1e5606c-kube-api-access-wfmdd" (OuterVolumeSpecName: "kube-api-access-wfmdd") pod "f0692334-5af0-46c4-9cb7-d562b1e5606c" (UID: "f0692334-5af0-46c4-9cb7-d562b1e5606c"). InnerVolumeSpecName "kube-api-access-wfmdd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:54:13.268161 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.268103 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0692334-5af0-46c4-9cb7-d562b1e5606c-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:54:13.268161 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.268129 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0692334-5af0-46c4-9cb7-d562b1e5606c-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:54:13.268161 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.268139 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wfmdd\" (UniqueName: \"kubernetes.io/projected/f0692334-5af0-46c4-9cb7-d562b1e5606c-kube-api-access-wfmdd\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:54:13.268161 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.268148 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0692334-5af0-46c4-9cb7-d562b1e5606c-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:54:13.908167 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.908129 2575 generic.go:358] "Generic (PLEG): container finished" podID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerID="a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121" exitCode=0 Apr 21 04:54:13.908599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.908209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" 
event={"ID":"f0692334-5af0-46c4-9cb7-d562b1e5606c","Type":"ContainerDied","Data":"a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121"} Apr 21 04:54:13.908599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.908243 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" Apr 21 04:54:13.908599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.908247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7" event={"ID":"f0692334-5af0-46c4-9cb7-d562b1e5606c","Type":"ContainerDied","Data":"e2d62625f7f4d33f3d475a130a95dfd4b57cc8aab62474705c5c8d826030f142"} Apr 21 04:54:13.908599 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.908266 2575 scope.go:117] "RemoveContainer" containerID="0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50" Apr 21 04:54:13.910329 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.910308 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" event={"ID":"f84a184a-823c-49f4-93a1-ba1fd0267128","Type":"ContainerStarted","Data":"222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a"} Apr 21 04:54:13.910445 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.910335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" event={"ID":"f84a184a-823c-49f4-93a1-ba1fd0267128","Type":"ContainerStarted","Data":"36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0"} Apr 21 04:54:13.910559 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.910542 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:13.916246 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.916226 2575 scope.go:117] "RemoveContainer" 
containerID="a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121" Apr 21 04:54:13.925417 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.925400 2575 scope.go:117] "RemoveContainer" containerID="dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4" Apr 21 04:54:13.932185 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.932167 2575 scope.go:117] "RemoveContainer" containerID="0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50" Apr 21 04:54:13.932454 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:54:13.932434 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50\": container with ID starting with 0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50 not found: ID does not exist" containerID="0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50" Apr 21 04:54:13.932554 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.932460 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50"} err="failed to get container status \"0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50\": rpc error: code = NotFound desc = could not find container \"0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50\": container with ID starting with 0efd840504a2c02e21e45318965e02db96c4dcab5d3719ec2da9c3ccb2825d50 not found: ID does not exist" Apr 21 04:54:13.932554 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.932477 2575 scope.go:117] "RemoveContainer" containerID="a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121" Apr 21 04:54:13.932740 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:54:13.932717 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121\": container with ID starting with a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121 not found: ID does not exist" containerID="a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121" Apr 21 04:54:13.932856 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.932748 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121"} err="failed to get container status \"a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121\": rpc error: code = NotFound desc = could not find container \"a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121\": container with ID starting with a3b3703794331dee4dfcd4b60cc63d28bec634e392d6fd8e1c0969cd9111a121 not found: ID does not exist" Apr 21 04:54:13.932856 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.932785 2575 scope.go:117] "RemoveContainer" containerID="dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4" Apr 21 04:54:13.933029 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:54:13.933014 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4\": container with ID starting with dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4 not found: ID does not exist" containerID="dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4" Apr 21 04:54:13.933081 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.933032 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4"} err="failed to get container status \"dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4\": rpc error: code = NotFound desc = could not find container 
\"dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4\": container with ID starting with dc21ca385bc4930952c23ecf3a7374f986990eb0c63ac5742cacbcdeb58504f4 not found: ID does not exist" Apr 21 04:54:13.939495 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.939449 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podStartSLOduration=4.939435967 podStartE2EDuration="4.939435967s" podCreationTimestamp="2026-04-21 04:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:54:13.938043457 +0000 UTC m=+3439.602633544" watchObservedRunningTime="2026-04-21 04:54:13.939435967 +0000 UTC m=+3439.604026054" Apr 21 04:54:13.959155 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.959133 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7"] Apr 21 04:54:13.965882 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:13.965850 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k4db7"] Apr 21 04:54:14.913741 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:14.913709 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:14.914921 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:14.914897 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:54:15.031997 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:15.031966 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" path="/var/lib/kubelet/pods/f0692334-5af0-46c4-9cb7-d562b1e5606c/volumes" Apr 21 04:54:15.917192 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:15.917155 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:54:20.921504 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:20.921475 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:54:20.921953 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:20.921928 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:54:30.921937 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:30.921890 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:54:40.922699 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:40.922659 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:54:50.922419 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:54:50.922380 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:55:00.922076 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:00.922033 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:55:10.922868 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:10.922832 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:55:20.923140 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:20.923068 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:55:30.044005 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.043971 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj"] Apr 21 04:55:30.044431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.044287 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" containerID="cri-o://36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0" gracePeriod=30 Apr 21 04:55:30.044431 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.044316 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kube-rbac-proxy" containerID="cri-o://222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a" gracePeriod=30 Apr 21 04:55:30.185476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185450 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"] Apr 21 04:55:30.185727 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185716 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kube-rbac-proxy" Apr 21 04:55:30.185832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185729 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kube-rbac-proxy" Apr 21 04:55:30.185832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185746 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" Apr 21 04:55:30.185832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185751 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" Apr 21 04:55:30.185832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185786 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="storage-initializer" Apr 21 04:55:30.185832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185792 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="storage-initializer" Apr 21 04:55:30.186018 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185845 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kserve-container" Apr 21 04:55:30.186018 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.185856 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0692334-5af0-46c4-9cb7-d562b1e5606c" containerName="kube-rbac-proxy" Apr 21 04:55:30.188681 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.188662 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.191080 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.191060 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 21 04:55:30.191204 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.191186 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 21 04:55:30.191266 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.191224 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 21 04:55:30.198080 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.198060 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"] Apr 21 04:55:30.306576 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.306496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9f26e39-da8c-4acf-8bfb-bb1489553724-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.306576 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:55:30.306530 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.306576 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.306570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.306803 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.306626 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb665\" (UniqueName: \"kubernetes.io/projected/d9f26e39-da8c-4acf-8bfb-bb1489553724-kube-api-access-tb665\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.306803 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.306698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f26e39-da8c-4acf-8bfb-bb1489553724-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.407554 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.407525 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tb665\" (UniqueName: \"kubernetes.io/projected/d9f26e39-da8c-4acf-8bfb-bb1489553724-kube-api-access-tb665\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.407697 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.407560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f26e39-da8c-4acf-8bfb-bb1489553724-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.407697 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.407600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9f26e39-da8c-4acf-8bfb-bb1489553724-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.407697 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.407623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: 
\"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.407697 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.407658 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.408026 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.408001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f26e39-da8c-4acf-8bfb-bb1489553724-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.408312 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.408292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.408359 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.408294 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: 
\"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.409987 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.409970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9f26e39-da8c-4acf-8bfb-bb1489553724-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.416244 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.416225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb665\" (UniqueName: \"kubernetes.io/projected/d9f26e39-da8c-4acf-8bfb-bb1489553724-kube-api-access-tb665\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.498877 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.498852 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:30.621157 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.621125 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"] Apr 21 04:55:30.623897 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:55:30.623870 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f26e39_da8c_4acf_8bfb_bb1489553724.slice/crio-830f7ebca6b7be03d79fc6a4eba7d7556ff8329855182ef2983a09732ab79058 WatchSource:0}: Error finding container 830f7ebca6b7be03d79fc6a4eba7d7556ff8329855182ef2983a09732ab79058: Status 404 returned error can't find the container with id 830f7ebca6b7be03d79fc6a4eba7d7556ff8329855182ef2983a09732ab79058 Apr 21 04:55:30.917713 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.917681 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.61:8643/healthz\": dial tcp 10.132.0.61:8643: connect: connection refused" Apr 21 04:55:30.922039 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:30.922015 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 21 04:55:31.124859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:31.124815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" 
event={"ID":"d9f26e39-da8c-4acf-8bfb-bb1489553724","Type":"ContainerStarted","Data":"da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f"} Apr 21 04:55:31.124859 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:31.124850 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" event={"ID":"d9f26e39-da8c-4acf-8bfb-bb1489553724","Type":"ContainerStarted","Data":"830f7ebca6b7be03d79fc6a4eba7d7556ff8329855182ef2983a09732ab79058"} Apr 21 04:55:31.126744 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:31.126718 2575 generic.go:358] "Generic (PLEG): container finished" podID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerID="222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a" exitCode=2 Apr 21 04:55:31.126905 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:31.126791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" event={"ID":"f84a184a-823c-49f4-93a1-ba1fd0267128","Type":"ContainerDied","Data":"222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a"} Apr 21 04:55:32.131423 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:32.131384 2575 generic.go:358] "Generic (PLEG): container finished" podID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerID="da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f" exitCode=0 Apr 21 04:55:32.131836 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:32.131465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" event={"ID":"d9f26e39-da8c-4acf-8bfb-bb1489553724","Type":"ContainerDied","Data":"da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f"} Apr 21 04:55:33.136255 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:33.136224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" event={"ID":"d9f26e39-da8c-4acf-8bfb-bb1489553724","Type":"ContainerStarted","Data":"ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708"} Apr 21 04:55:33.136255 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:33.136256 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" event={"ID":"d9f26e39-da8c-4acf-8bfb-bb1489553724","Type":"ContainerStarted","Data":"189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856"} Apr 21 04:55:33.136715 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:33.136340 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" Apr 21 04:55:33.155285 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:33.155238 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podStartSLOduration=3.15522308 podStartE2EDuration="3.15522308s" podCreationTimestamp="2026-04-21 04:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:55:33.154032383 +0000 UTC m=+3518.818622469" watchObservedRunningTime="2026-04-21 04:55:33.15522308 +0000 UTC m=+3518.819813165" Apr 21 04:55:33.988333 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:33.988312 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" Apr 21 04:55:34.037374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.037340 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwgs6\" (UniqueName: \"kubernetes.io/projected/f84a184a-823c-49f4-93a1-ba1fd0267128-kube-api-access-wwgs6\") pod \"f84a184a-823c-49f4-93a1-ba1fd0267128\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " Apr 21 04:55:34.037509 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.037424 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls\") pod \"f84a184a-823c-49f4-93a1-ba1fd0267128\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " Apr 21 04:55:34.037580 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.037507 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f84a184a-823c-49f4-93a1-ba1fd0267128-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"f84a184a-823c-49f4-93a1-ba1fd0267128\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " Apr 21 04:55:34.037884 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.037857 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f84a184a-823c-49f4-93a1-ba1fd0267128-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "f84a184a-823c-49f4-93a1-ba1fd0267128" (UID: "f84a184a-823c-49f4-93a1-ba1fd0267128"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:55:34.039839 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.039816 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84a184a-823c-49f4-93a1-ba1fd0267128-kube-api-access-wwgs6" (OuterVolumeSpecName: "kube-api-access-wwgs6") pod "f84a184a-823c-49f4-93a1-ba1fd0267128" (UID: "f84a184a-823c-49f4-93a1-ba1fd0267128"). InnerVolumeSpecName "kube-api-access-wwgs6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:55:34.039914 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.039898 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f84a184a-823c-49f4-93a1-ba1fd0267128" (UID: "f84a184a-823c-49f4-93a1-ba1fd0267128"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:55:34.137899 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.137870 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f84a184a-823c-49f4-93a1-ba1fd0267128-kserve-provision-location\") pod \"f84a184a-823c-49f4-93a1-ba1fd0267128\" (UID: \"f84a184a-823c-49f4-93a1-ba1fd0267128\") " Apr 21 04:55:34.138226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.138009 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f84a184a-823c-49f4-93a1-ba1fd0267128-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:55:34.138226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.138022 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwgs6\" (UniqueName: 
\"kubernetes.io/projected/f84a184a-823c-49f4-93a1-ba1fd0267128-kube-api-access-wwgs6\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:55:34.138226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.138032 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f84a184a-823c-49f4-93a1-ba1fd0267128-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:55:34.138226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.138208 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84a184a-823c-49f4-93a1-ba1fd0267128-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f84a184a-823c-49f4-93a1-ba1fd0267128" (UID: "f84a184a-823c-49f4-93a1-ba1fd0267128"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:55:34.142569 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.142541 2575 generic.go:358] "Generic (PLEG): container finished" podID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerID="36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0" exitCode=0 Apr 21 04:55:34.142692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.142618 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj"
Apr 21 04:55:34.142692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.142622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" event={"ID":"f84a184a-823c-49f4-93a1-ba1fd0267128","Type":"ContainerDied","Data":"36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0"}
Apr 21 04:55:34.142692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.142658 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj" event={"ID":"f84a184a-823c-49f4-93a1-ba1fd0267128","Type":"ContainerDied","Data":"100a1a1ae24a203eb26fa396d8f99c9ffb348683c1bff7cf781955ead1f67ec5"}
Apr 21 04:55:34.142692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.142678 2575 scope.go:117] "RemoveContainer" containerID="222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a"
Apr 21 04:55:34.143525 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.143508 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"
Apr 21 04:55:34.145095 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.145068 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused"
Apr 21 04:55:34.152634 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.152608 2575 scope.go:117] "RemoveContainer" containerID="36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0"
Apr 21 04:55:34.159535 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.159519 2575 scope.go:117] "RemoveContainer" containerID="6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d"
Apr 21 04:55:34.166054 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.166037 2575 scope.go:117] "RemoveContainer" containerID="222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a"
Apr 21 04:55:34.166278 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:55:34.166260 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a\": container with ID starting with 222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a not found: ID does not exist" containerID="222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a"
Apr 21 04:55:34.166346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.166290 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a"} err="failed to get container status \"222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a\": rpc error: code = NotFound desc = could not find container \"222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a\": container with ID starting with 222c70b717f7f13ebe32505d1c28fd707553065fb749f0b2681071cf2f896b2a not found: ID does not exist"
Apr 21 04:55:34.166346 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.166313 2575 scope.go:117] "RemoveContainer" containerID="36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0"
Apr 21 04:55:34.166542 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:55:34.166525 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0\": container with ID starting with 36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0 not found: ID does not exist" containerID="36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0"
Apr 21 04:55:34.166583 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.166547 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0"} err="failed to get container status \"36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0\": rpc error: code = NotFound desc = could not find container \"36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0\": container with ID starting with 36e406bac9407661bdac6075b7dcb9f0a1f1c5ebf99af04139cb21d8090cabf0 not found: ID does not exist"
Apr 21 04:55:34.166583 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.166562 2575 scope.go:117] "RemoveContainer" containerID="6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d"
Apr 21 04:55:34.166784 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:55:34.166746 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d\": container with ID starting with 6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d not found: ID does not exist" containerID="6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d"
Apr 21 04:55:34.166832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.166782 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d"} err="failed to get container status \"6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d\": rpc error: code = NotFound desc = could not find container \"6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d\": container with ID starting with 6980a390dde280ecc7c1d269fef2ed9e4f153a5033092b5eb8b37cdafec5fc0d not found: ID does not exist"
Apr 21 04:55:34.169692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.169674 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj"]
Apr 21 04:55:34.172405 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.172381 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nk8sj"]
Apr 21 04:55:34.238753 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:34.238734 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f84a184a-823c-49f4-93a1-ba1fd0267128-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:55:35.027924 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:35.027889 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" path="/var/lib/kubelet/pods/f84a184a-823c-49f4-93a1-ba1fd0267128/volumes"
Apr 21 04:55:35.147047 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:35.147013 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused"
Apr 21 04:55:40.151490 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:40.151461 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"
Apr 21 04:55:40.151954 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:40.151935 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused"
Apr 21 04:55:50.152645 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:55:50.152606 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused"
Apr 21 04:56:00.152504 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:00.152466 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused"
Apr 21 04:56:10.151991 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:10.151948 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused"
Apr 21 04:56:20.152075 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:20.152027 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused"
Apr 21 04:56:30.152720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:30.152678 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused"
Apr 21 04:56:40.152596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:40.152567 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"
Apr 21 04:56:50.208288 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:50.208256 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"]
Apr 21 04:56:50.208693 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:50.208640 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" containerID="cri-o://189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856" gracePeriod=30
Apr 21 04:56:50.208799 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:50.208727 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kube-rbac-proxy" containerID="cri-o://ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708" gracePeriod=30
Apr 21 04:56:50.350680 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:50.350652 2575 generic.go:358] "Generic (PLEG): container finished" podID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerID="ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708" exitCode=2
Apr 21 04:56:50.350829 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:50.350710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" event={"ID":"d9f26e39-da8c-4acf-8bfb-bb1489553724","Type":"ContainerDied","Data":"ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708"}
Apr 21 04:56:51.302316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302284 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"]
Apr 21 04:56:51.302692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302619 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container"
Apr 21 04:56:51.302692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302636 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container"
Apr 21 04:56:51.302692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302658 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kube-rbac-proxy"
Apr 21 04:56:51.302692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302663 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kube-rbac-proxy"
Apr 21 04:56:51.302692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302673 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="storage-initializer"
Apr 21 04:56:51.302692 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302680 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="storage-initializer"
Apr 21 04:56:51.302931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302742 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kube-rbac-proxy"
Apr 21 04:56:51.302931 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.302777 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f84a184a-823c-49f4-93a1-ba1fd0267128" containerName="kserve-container"
Apr 21 04:56:51.305686 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.305670 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.307985 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.307953 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\""
Apr 21 04:56:51.308087 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.308046 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\""
Apr 21 04:56:51.313613 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.313594 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"]
Apr 21 04:56:51.463183 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.463147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.463183 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.463184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr75t\" (UniqueName: \"kubernetes.io/projected/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kube-api-access-nr75t\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.463418 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.463206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.463418 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.463222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.564035 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.563945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr75t\" (UniqueName: \"kubernetes.io/projected/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kube-api-access-nr75t\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.564035 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.563989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.564035 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.564014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.564297 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:56:51.564102 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found
Apr 21 04:56:51.564297 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.564119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.564297 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:56:51.564174 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls podName:1b95e3a3-85cd-481f-9aa6-6bfe4c86d697 nodeName:}" failed. No retries permitted until 2026-04-21 04:56:52.064156981 +0000 UTC m=+3597.728747049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls") pod "isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" (UID: "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697") : secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found
Apr 21 04:56:51.564473 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.564432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.564787 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.564742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:51.572580 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:51.572557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr75t\" (UniqueName: \"kubernetes.io/projected/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kube-api-access-nr75t\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:52.067617 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:52.067583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:52.069925 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:52.069898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:52.216563 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:52.216518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"
Apr 21 04:56:52.329964 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:52.329892 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"]
Apr 21 04:56:52.333381 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:56:52.333357 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b95e3a3_85cd_481f_9aa6_6bfe4c86d697.slice/crio-275494f0a87b48e54e3810481d978f6bf53109addabdc3996be6b8f509a4c8f8 WatchSource:0}: Error finding container 275494f0a87b48e54e3810481d978f6bf53109addabdc3996be6b8f509a4c8f8: Status 404 returned error can't find the container with id 275494f0a87b48e54e3810481d978f6bf53109addabdc3996be6b8f509a4c8f8
Apr 21 04:56:52.357277 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:52.357254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" event={"ID":"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697","Type":"ContainerStarted","Data":"275494f0a87b48e54e3810481d978f6bf53109addabdc3996be6b8f509a4c8f8"}
Apr 21 04:56:53.361681 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:53.361647 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" event={"ID":"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697","Type":"ContainerStarted","Data":"ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2"}
Apr 21 04:56:54.345897 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.345873 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"
Apr 21 04:56:54.366826 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.366791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" event={"ID":"d9f26e39-da8c-4acf-8bfb-bb1489553724","Type":"ContainerDied","Data":"189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856"}
Apr 21 04:56:54.366826 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.366752 2575 generic.go:358] "Generic (PLEG): container finished" podID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerID="189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856" exitCode=0
Apr 21 04:56:54.367267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.366836 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq" event={"ID":"d9f26e39-da8c-4acf-8bfb-bb1489553724","Type":"ContainerDied","Data":"830f7ebca6b7be03d79fc6a4eba7d7556ff8329855182ef2983a09732ab79058"}
Apr 21 04:56:54.367267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.366867 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"
Apr 21 04:56:54.367267 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.366870 2575 scope.go:117] "RemoveContainer" containerID="ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708"
Apr 21 04:56:54.375483 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.375467 2575 scope.go:117] "RemoveContainer" containerID="189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856"
Apr 21 04:56:54.382098 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.382078 2575 scope.go:117] "RemoveContainer" containerID="da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f"
Apr 21 04:56:54.388486 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.388469 2575 scope.go:117] "RemoveContainer" containerID="ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708"
Apr 21 04:56:54.388714 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:56:54.388695 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708\": container with ID starting with ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708 not found: ID does not exist" containerID="ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708"
Apr 21 04:56:54.388773 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.388724 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708"} err="failed to get container status \"ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708\": rpc error: code = NotFound desc = could not find container \"ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708\": container with ID starting with ba0902efd96a3d48f653979cf3f0da990d0e8a737f17502d3ca55166d7da2708 not found: ID does not exist"
Apr 21 04:56:54.388773 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.388741 2575 scope.go:117] "RemoveContainer" containerID="189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856"
Apr 21 04:56:54.389013 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:56:54.388984 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856\": container with ID starting with 189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856 not found: ID does not exist" containerID="189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856"
Apr 21 04:56:54.389063 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.389011 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856"} err="failed to get container status \"189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856\": rpc error: code = NotFound desc = could not find container \"189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856\": container with ID starting with 189118a4c10ff7d7cfbd0f6fcdbd14c3752a90c855c846c892afd5a566a92856 not found: ID does not exist"
Apr 21 04:56:54.389063 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.389029 2575 scope.go:117] "RemoveContainer" containerID="da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f"
Apr 21 04:56:54.389224 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:56:54.389205 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f\": container with ID starting with da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f not found: ID does not exist" containerID="da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f"
Apr 21 04:56:54.389270 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.389229 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f"} err="failed to get container status \"da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f\": rpc error: code = NotFound desc = could not find container \"da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f\": container with ID starting with da2e7c9d58eda84b36ee1e208bcb9b8f89c25ace675f1ab1c7b84975c234886f not found: ID does not exist"
Apr 21 04:56:54.484656 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.484604 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f26e39-da8c-4acf-8bfb-bb1489553724-kserve-provision-location\") pod \"d9f26e39-da8c-4acf-8bfb-bb1489553724\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") "
Apr 21 04:56:54.484776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.484653 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"d9f26e39-da8c-4acf-8bfb-bb1489553724\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") "
Apr 21 04:56:54.484776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.484674 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9f26e39-da8c-4acf-8bfb-bb1489553724-proxy-tls\") pod \"d9f26e39-da8c-4acf-8bfb-bb1489553724\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") "
Apr 21 04:56:54.484776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.484701 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb665\" (UniqueName: \"kubernetes.io/projected/d9f26e39-da8c-4acf-8bfb-bb1489553724-kube-api-access-tb665\") pod \"d9f26e39-da8c-4acf-8bfb-bb1489553724\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") "
Apr 21 04:56:54.484776 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.484727 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-cabundle-cert\") pod \"d9f26e39-da8c-4acf-8bfb-bb1489553724\" (UID: \"d9f26e39-da8c-4acf-8bfb-bb1489553724\") "
Apr 21 04:56:54.485076 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.485053 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f26e39-da8c-4acf-8bfb-bb1489553724-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d9f26e39-da8c-4acf-8bfb-bb1489553724" (UID: "d9f26e39-da8c-4acf-8bfb-bb1489553724"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:56:54.485137 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.485078 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "d9f26e39-da8c-4acf-8bfb-bb1489553724" (UID: "d9f26e39-da8c-4acf-8bfb-bb1489553724"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:56:54.485184 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.485150 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "d9f26e39-da8c-4acf-8bfb-bb1489553724" (UID: "d9f26e39-da8c-4acf-8bfb-bb1489553724"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:56:54.486684 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.486663 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f26e39-da8c-4acf-8bfb-bb1489553724-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d9f26e39-da8c-4acf-8bfb-bb1489553724" (UID: "d9f26e39-da8c-4acf-8bfb-bb1489553724"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:56:54.486839 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.486817 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f26e39-da8c-4acf-8bfb-bb1489553724-kube-api-access-tb665" (OuterVolumeSpecName: "kube-api-access-tb665") pod "d9f26e39-da8c-4acf-8bfb-bb1489553724" (UID: "d9f26e39-da8c-4acf-8bfb-bb1489553724"). InnerVolumeSpecName "kube-api-access-tb665". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:56:54.586262 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.586235 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f26e39-da8c-4acf-8bfb-bb1489553724-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:56:54.586262 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.586260 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:56:54.586424 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.586270 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9f26e39-da8c-4acf-8bfb-bb1489553724-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:56:54.586424 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.586281 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tb665\" (UniqueName: \"kubernetes.io/projected/d9f26e39-da8c-4acf-8bfb-bb1489553724-kube-api-access-tb665\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:56:54.586424 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.586290 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d9f26e39-da8c-4acf-8bfb-bb1489553724-cabundle-cert\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:56:54.689596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.689172 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"]
Apr 21 04:56:54.695972 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:54.695946 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rfmkq"]
Apr 21 04:56:55.027629 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:55.027604 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" path="/var/lib/kubelet/pods/d9f26e39-da8c-4acf-8bfb-bb1489553724/volumes"
Apr 21 04:56:59.381033 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:59.381004 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk_1b95e3a3-85cd-481f-9aa6-6bfe4c86d697/storage-initializer/0.log"
Apr 21 04:56:59.381394 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:59.381049 2575 generic.go:358] "Generic (PLEG): container finished" podID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerID="ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2" exitCode=1
Apr 21 04:56:59.381394 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:56:59.381079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" event={"ID":"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697","Type":"ContainerDied","Data":"ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2"}
Apr 21 04:57:00.385938 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:00.385907 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk_1b95e3a3-85cd-481f-9aa6-6bfe4c86d697/storage-initializer/0.log"
Apr 21 04:57:00.386322 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:00.385974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" event={"ID":"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697","Type":"ContainerStarted","Data":"1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b"}
Apr 21 04:57:01.296663 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:01.296632 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"]
Apr 21 04:57:01.388419 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:01.388363 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" podUID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerName="storage-initializer" containerID="cri-o://1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b" gracePeriod=30
Apr 21 04:57:02.370286 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370252 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"]
Apr 21 04:57:02.370553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370538 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="storage-initializer"
Apr 21 04:57:02.370553 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370553 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="storage-initializer"
Apr 21 04:57:02.370699 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370567 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kube-rbac-proxy"
Apr 21 04:57:02.370699 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370573 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kube-rbac-proxy"
Apr 21 04:57:02.370699 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370588
2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" Apr 21 04:57:02.370699 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370597 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" Apr 21 04:57:02.370699 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370649 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kube-rbac-proxy" Apr 21 04:57:02.370699 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.370660 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9f26e39-da8c-4acf-8bfb-bb1489553724" containerName="kserve-container" Apr 21 04:57:02.373828 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.373806 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.376146 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.376126 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 21 04:57:02.376385 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.376368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 21 04:57:02.376473 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.376405 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 21 04:57:02.382089 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.382070 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"] Apr 21 
04:57:02.540299 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.540271 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.540659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.540311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7491fd0a-842c-4776-b578-6c4bc226491a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.540659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.540388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7491fd0a-842c-4776-b578-6c4bc226491a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.540659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.540462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.540659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.540493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnpt5\" (UniqueName: \"kubernetes.io/projected/7491fd0a-842c-4776-b578-6c4bc226491a-kube-api-access-pnpt5\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.641667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.641563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.641667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.641616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnpt5\" (UniqueName: \"kubernetes.io/projected/7491fd0a-842c-4776-b578-6c4bc226491a-kube-api-access-pnpt5\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.641948 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.641678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: 
\"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.641948 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.641703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7491fd0a-842c-4776-b578-6c4bc226491a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.641948 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.641752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7491fd0a-842c-4776-b578-6c4bc226491a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.642158 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.642131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7491fd0a-842c-4776-b578-6c4bc226491a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.642378 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.642355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.642430 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.642391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.644166 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.644145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7491fd0a-842c-4776-b578-6c4bc226491a-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.650275 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.650255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnpt5\" (UniqueName: \"kubernetes.io/projected/7491fd0a-842c-4776-b578-6c4bc226491a-kube-api-access-pnpt5\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.683733 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.683713 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:02.798808 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:02.798779 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"] Apr 21 04:57:02.801331 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:57:02.801304 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7491fd0a_842c_4776_b578_6c4bc226491a.slice/crio-64aa1cb11040d8b6e960c5ab40daabe3f9aceb4f9b4dc233de1516b37435f536 WatchSource:0}: Error finding container 64aa1cb11040d8b6e960c5ab40daabe3f9aceb4f9b4dc233de1516b37435f536: Status 404 returned error can't find the container with id 64aa1cb11040d8b6e960c5ab40daabe3f9aceb4f9b4dc233de1516b37435f536 Apr 21 04:57:03.312947 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.312922 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk_1b95e3a3-85cd-481f-9aa6-6bfe4c86d697/storage-initializer/1.log" Apr 21 04:57:03.313327 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.313313 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk_1b95e3a3-85cd-481f-9aa6-6bfe4c86d697/storage-initializer/0.log" Apr 21 04:57:03.313391 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.313373 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" Apr 21 04:57:03.394869 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.394807 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk_1b95e3a3-85cd-481f-9aa6-6bfe4c86d697/storage-initializer/1.log" Apr 21 04:57:03.395193 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.395179 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk_1b95e3a3-85cd-481f-9aa6-6bfe4c86d697/storage-initializer/0.log" Apr 21 04:57:03.395255 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.395213 2575 generic.go:358] "Generic (PLEG): container finished" podID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerID="1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b" exitCode=1 Apr 21 04:57:03.395341 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.395318 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" Apr 21 04:57:03.395462 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.395315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" event={"ID":"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697","Type":"ContainerDied","Data":"1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b"} Apr 21 04:57:03.395462 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.395438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk" event={"ID":"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697","Type":"ContainerDied","Data":"275494f0a87b48e54e3810481d978f6bf53109addabdc3996be6b8f509a4c8f8"} Apr 21 04:57:03.395462 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.395462 2575 scope.go:117] "RemoveContainer" containerID="1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b" Apr 21 04:57:03.396737 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.396716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" event={"ID":"7491fd0a-842c-4776-b578-6c4bc226491a","Type":"ContainerStarted","Data":"0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc"} Apr 21 04:57:03.396837 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.396743 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" event={"ID":"7491fd0a-842c-4776-b578-6c4bc226491a","Type":"ContainerStarted","Data":"64aa1cb11040d8b6e960c5ab40daabe3f9aceb4f9b4dc233de1516b37435f536"} Apr 21 04:57:03.403252 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.403233 2575 scope.go:117] "RemoveContainer" containerID="ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2" Apr 21 
04:57:03.409946 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.409921 2575 scope.go:117] "RemoveContainer" containerID="1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b" Apr 21 04:57:03.410176 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:57:03.410158 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b\": container with ID starting with 1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b not found: ID does not exist" containerID="1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b" Apr 21 04:57:03.410228 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.410183 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b"} err="failed to get container status \"1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b\": rpc error: code = NotFound desc = could not find container \"1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b\": container with ID starting with 1cd715a18e060f873299da94dd194ae58f1cf1d82ff7e5d8ad24f9035c1f1d8b not found: ID does not exist" Apr 21 04:57:03.410228 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.410201 2575 scope.go:117] "RemoveContainer" containerID="ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2" Apr 21 04:57:03.410452 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:57:03.410431 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2\": container with ID starting with ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2 not found: ID does not exist" containerID="ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2" Apr 21 04:57:03.410503 
ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.410461 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2"} err="failed to get container status \"ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2\": rpc error: code = NotFound desc = could not find container \"ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2\": container with ID starting with ca7670911fc5d835714ed3ca4572f3ab79a41be0775d4bd4191e9face2c6ebb2 not found: ID does not exist" Apr 21 04:57:03.447630 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.447609 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " Apr 21 04:57:03.447737 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.447647 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls\") pod \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " Apr 21 04:57:03.447737 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.447673 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr75t\" (UniqueName: \"kubernetes.io/projected/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kube-api-access-nr75t\") pod \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " Apr 21 04:57:03.447737 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.447722 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kserve-provision-location\") pod \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\" (UID: \"1b95e3a3-85cd-481f-9aa6-6bfe4c86d697\") " Apr 21 04:57:03.447967 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.447932 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" (UID: "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:57:03.448081 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.448037 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" (UID: "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:57:03.449538 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.449514 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" (UID: "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:57:03.449659 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.449642 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kube-api-access-nr75t" (OuterVolumeSpecName: "kube-api-access-nr75t") pod "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" (UID: "1b95e3a3-85cd-481f-9aa6-6bfe4c86d697"). InnerVolumeSpecName "kube-api-access-nr75t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:57:03.548820 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.548743 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:57:03.548820 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.548796 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:57:03.548820 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.548814 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:57:03.548820 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.548828 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nr75t\" (UniqueName: \"kubernetes.io/projected/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697-kube-api-access-nr75t\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:57:03.738254 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.738219 2575 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"] Apr 21 04:57:03.742409 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:03.742381 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-mmmqk"] Apr 21 04:57:04.401826 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:04.401795 2575 generic.go:358] "Generic (PLEG): container finished" podID="7491fd0a-842c-4776-b578-6c4bc226491a" containerID="0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc" exitCode=0 Apr 21 04:57:04.401999 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:04.401873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" event={"ID":"7491fd0a-842c-4776-b578-6c4bc226491a","Type":"ContainerDied","Data":"0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc"} Apr 21 04:57:05.027975 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:05.027902 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" path="/var/lib/kubelet/pods/1b95e3a3-85cd-481f-9aa6-6bfe4c86d697/volumes" Apr 21 04:57:05.407229 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:05.407193 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" event={"ID":"7491fd0a-842c-4776-b578-6c4bc226491a","Type":"ContainerStarted","Data":"8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527"} Apr 21 04:57:05.407229 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:05.407235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" event={"ID":"7491fd0a-842c-4776-b578-6c4bc226491a","Type":"ContainerStarted","Data":"e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6"} Apr 21 
04:57:05.407476 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:05.407364 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:06.410206 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:06.410176 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:06.411299 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:06.411275 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 21 04:57:07.413463 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:07.413426 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 21 04:57:12.417618 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:12.417592 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" Apr 21 04:57:12.418196 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:12.418167 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 21 04:57:12.438941 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:12.438900 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podStartSLOduration=10.438889619 podStartE2EDuration="10.438889619s" podCreationTimestamp="2026-04-21 04:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:57:05.426245166 +0000 UTC m=+3611.090835262" watchObservedRunningTime="2026-04-21 04:57:12.438889619 +0000 UTC m=+3618.103479705"
Apr 21 04:57:22.418193 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:22.418155 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused"
Apr 21 04:57:32.418130 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:32.418092 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused"
Apr 21 04:57:42.418136 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:42.418097 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused"
Apr 21 04:57:52.419062 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:57:52.419016 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused"
Apr 21 04:58:02.418217 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:02.418180 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused"
Apr 21 04:58:12.419295 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:12.419267 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"
Apr 21 04:58:22.404900 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:22.404823 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"]
Apr 21 04:58:22.405256 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:22.405142 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container" containerID="cri-o://e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6" gracePeriod=30
Apr 21 04:58:22.405256 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:22.405189 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kube-rbac-proxy" containerID="cri-o://8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527" gracePeriod=30
Apr 21 04:58:22.413775 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:22.413736 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.64:8643/healthz\": dial tcp 10.132.0.64:8643: connect: connection refused"
Apr 21 04:58:22.618217 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:22.618178 2575 generic.go:358] "Generic (PLEG): container finished" podID="7491fd0a-842c-4776-b578-6c4bc226491a" containerID="8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527" exitCode=2
Apr 21 04:58:22.618368 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:22.618254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" event={"ID":"7491fd0a-842c-4776-b578-6c4bc226491a","Type":"ContainerDied","Data":"8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527"}
Apr 21 04:58:23.495677 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.495644 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"]
Apr 21 04:58:23.496048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.495928 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerName="storage-initializer"
Apr 21 04:58:23.496048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.495939 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerName="storage-initializer"
Apr 21 04:58:23.496048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.495948 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerName="storage-initializer"
Apr 21 04:58:23.496048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.495953 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerName="storage-initializer"
Apr 21 04:58:23.496048 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.496005 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerName="storage-initializer"
Apr 21 04:58:23.496215 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.496092 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b95e3a3-85cd-481f-9aa6-6bfe4c86d697" containerName="storage-initializer"
Apr 21 04:58:23.498972 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.498954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.501332 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.501315 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\""
Apr 21 04:58:23.501433 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.501365 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\""
Apr 21 04:58:23.509677 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.509654 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"]
Apr 21 04:58:23.600413 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.600385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dncsd\" (UniqueName: \"kubernetes.io/projected/5459210d-4eec-49fe-b8a4-05ea604a11fb-kube-api-access-dncsd\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.600521 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.600427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5459210d-4eec-49fe-b8a4-05ea604a11fb-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.600521 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.600483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5459210d-4eec-49fe-b8a4-05ea604a11fb-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.600521 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.600518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5459210d-4eec-49fe-b8a4-05ea604a11fb-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.700969 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.700942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5459210d-4eec-49fe-b8a4-05ea604a11fb-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.701115 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.700980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5459210d-4eec-49fe-b8a4-05ea604a11fb-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.701115 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.701030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dncsd\" (UniqueName: \"kubernetes.io/projected/5459210d-4eec-49fe-b8a4-05ea604a11fb-kube-api-access-dncsd\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.701115 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.701052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5459210d-4eec-49fe-b8a4-05ea604a11fb-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.701461 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.701434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5459210d-4eec-49fe-b8a4-05ea604a11fb-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.701808 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.701790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5459210d-4eec-49fe-b8a4-05ea604a11fb-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.703296 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.703280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5459210d-4eec-49fe-b8a4-05ea604a11fb-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.710138 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.710115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dncsd\" (UniqueName: \"kubernetes.io/projected/5459210d-4eec-49fe-b8a4-05ea604a11fb-kube-api-access-dncsd\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.809172 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.809106 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:23.924753 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:23.924720 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"]
Apr 21 04:58:23.927633 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:58:23.927607 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5459210d_4eec_49fe_b8a4_05ea604a11fb.slice/crio-9b8a59aa2ce99cc72d875db8e397fd89aafc19bc72806e6d7c5049c332c5a6bf WatchSource:0}: Error finding container 9b8a59aa2ce99cc72d875db8e397fd89aafc19bc72806e6d7c5049c332c5a6bf: Status 404 returned error can't find the container with id 9b8a59aa2ce99cc72d875db8e397fd89aafc19bc72806e6d7c5049c332c5a6bf
Apr 21 04:58:24.625958 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:24.625908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz" event={"ID":"5459210d-4eec-49fe-b8a4-05ea604a11fb","Type":"ContainerStarted","Data":"3203dbadb9e0647dddd1844cc01124a16d3b88f81fe6e7a9ee72c0169dd84970"}
Apr 21 04:58:24.625958 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:24.625961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz" event={"ID":"5459210d-4eec-49fe-b8a4-05ea604a11fb","Type":"ContainerStarted","Data":"9b8a59aa2ce99cc72d875db8e397fd89aafc19bc72806e6d7c5049c332c5a6bf"}
Apr 21 04:58:26.346233 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.346211 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"
Apr 21 04:58:26.422832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.422749 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7491fd0a-842c-4776-b578-6c4bc226491a-kserve-provision-location\") pod \"7491fd0a-842c-4776-b578-6c4bc226491a\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") "
Apr 21 04:58:26.422832 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.422800 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnpt5\" (UniqueName: \"kubernetes.io/projected/7491fd0a-842c-4776-b578-6c4bc226491a-kube-api-access-pnpt5\") pod \"7491fd0a-842c-4776-b578-6c4bc226491a\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") "
Apr 21 04:58:26.422998 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.422843 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-cabundle-cert\") pod \"7491fd0a-842c-4776-b578-6c4bc226491a\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") "
Apr 21 04:58:26.422998 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.422864 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7491fd0a-842c-4776-b578-6c4bc226491a-proxy-tls\") pod \"7491fd0a-842c-4776-b578-6c4bc226491a\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") "
Apr 21 04:58:26.422998 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.422894 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"7491fd0a-842c-4776-b578-6c4bc226491a\" (UID: \"7491fd0a-842c-4776-b578-6c4bc226491a\") "
Apr 21 04:58:26.423155 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.423124 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7491fd0a-842c-4776-b578-6c4bc226491a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7491fd0a-842c-4776-b578-6c4bc226491a" (UID: "7491fd0a-842c-4776-b578-6c4bc226491a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:58:26.423251 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.423227 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "7491fd0a-842c-4776-b578-6c4bc226491a" (UID: "7491fd0a-842c-4776-b578-6c4bc226491a"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:58:26.423335 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.423312 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "7491fd0a-842c-4776-b578-6c4bc226491a" (UID: "7491fd0a-842c-4776-b578-6c4bc226491a"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:58:26.424926 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.424901 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7491fd0a-842c-4776-b578-6c4bc226491a-kube-api-access-pnpt5" (OuterVolumeSpecName: "kube-api-access-pnpt5") pod "7491fd0a-842c-4776-b578-6c4bc226491a" (UID: "7491fd0a-842c-4776-b578-6c4bc226491a"). InnerVolumeSpecName "kube-api-access-pnpt5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:58:26.425028 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.425003 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7491fd0a-842c-4776-b578-6c4bc226491a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7491fd0a-842c-4776-b578-6c4bc226491a" (UID: "7491fd0a-842c-4776-b578-6c4bc226491a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:58:26.524316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.524291 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7491fd0a-842c-4776-b578-6c4bc226491a-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:58:26.524316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.524313 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pnpt5\" (UniqueName: \"kubernetes.io/projected/7491fd0a-842c-4776-b578-6c4bc226491a-kube-api-access-pnpt5\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:58:26.524446 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.524322 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-cabundle-cert\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:58:26.524446 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.524333 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7491fd0a-842c-4776-b578-6c4bc226491a-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:58:26.524446 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.524341 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7491fd0a-842c-4776-b578-6c4bc226491a-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\""
Apr 21 04:58:26.633109 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.633080 2575 generic.go:358] "Generic (PLEG): container finished" podID="7491fd0a-842c-4776-b578-6c4bc226491a" containerID="e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6" exitCode=0
Apr 21 04:58:26.633223 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.633155 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"
Apr 21 04:58:26.633223 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.633167 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" event={"ID":"7491fd0a-842c-4776-b578-6c4bc226491a","Type":"ContainerDied","Data":"e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6"}
Apr 21 04:58:26.633223 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.633216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2" event={"ID":"7491fd0a-842c-4776-b578-6c4bc226491a","Type":"ContainerDied","Data":"64aa1cb11040d8b6e960c5ab40daabe3f9aceb4f9b4dc233de1516b37435f536"}
Apr 21 04:58:26.633382 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.633240 2575 scope.go:117] "RemoveContainer" containerID="8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527"
Apr 21 04:58:26.640538 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.640519 2575 scope.go:117] "RemoveContainer" containerID="e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6"
Apr 21 04:58:26.647413 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.647398 2575 scope.go:117] "RemoveContainer" containerID="0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc"
Apr 21 04:58:26.654421 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.654407 2575 scope.go:117] "RemoveContainer" containerID="8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527"
Apr 21 04:58:26.654673 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:58:26.654650 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527\": container with ID starting with 8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527 not found: ID does not exist" containerID="8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527"
Apr 21 04:58:26.654738 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.654686 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527"} err="failed to get container status \"8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527\": rpc error: code = NotFound desc = could not find container \"8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527\": container with ID starting with 8b904c32ede3a70edff65d99f03b0b8458478a4dab4cb3d0c1f0b6beff433527 not found: ID does not exist"
Apr 21 04:58:26.654738 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.654711 2575 scope.go:117] "RemoveContainer" containerID="e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6"
Apr 21 04:58:26.654934 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.654916 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"]
Apr 21 04:58:26.654988 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:58:26.654966 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6\": container with ID starting with e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6 not found: ID does not exist" containerID="e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6"
Apr 21 04:58:26.655025 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.654991 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6"} err="failed to get container status \"e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6\": rpc error: code = NotFound desc = could not find container \"e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6\": container with ID starting with e698c117a4f16d4d0a13805c90ce8792c563397b8a118eab8fa0a5896e7eaab6 not found: ID does not exist"
Apr 21 04:58:26.655025 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.655008 2575 scope.go:117] "RemoveContainer" containerID="0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc"
Apr 21 04:58:26.655255 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:58:26.655237 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc\": container with ID starting with 0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc not found: ID does not exist" containerID="0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc"
Apr 21 04:58:26.655314 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.655262 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc"} err="failed to get container status \"0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc\": rpc error: code = NotFound desc = could not find container \"0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc\": container with ID starting with 0385b847424e76a8894c8fb2dec33f039df596731292a760bd370420fd3810bc not found: ID does not exist"
Apr 21 04:58:26.658063 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:26.658042 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-ftbl2"]
Apr 21 04:58:27.027834 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:27.027801 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" path="/var/lib/kubelet/pods/7491fd0a-842c-4776-b578-6c4bc226491a/volumes"
Apr 21 04:58:29.642752 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:29.642726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz_5459210d-4eec-49fe-b8a4-05ea604a11fb/storage-initializer/0.log"
Apr 21 04:58:29.643189 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:29.642775 2575 generic.go:358] "Generic (PLEG): container finished" podID="5459210d-4eec-49fe-b8a4-05ea604a11fb" containerID="3203dbadb9e0647dddd1844cc01124a16d3b88f81fe6e7a9ee72c0169dd84970" exitCode=1
Apr 21 04:58:29.643189 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:29.642811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz" event={"ID":"5459210d-4eec-49fe-b8a4-05ea604a11fb","Type":"ContainerDied","Data":"3203dbadb9e0647dddd1844cc01124a16d3b88f81fe6e7a9ee72c0169dd84970"}
Apr 21 04:58:30.646982 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:30.646946 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz_5459210d-4eec-49fe-b8a4-05ea604a11fb/storage-initializer/0.log"
Apr 21 04:58:30.647432 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:30.646996 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz" event={"ID":"5459210d-4eec-49fe-b8a4-05ea604a11fb","Type":"ContainerStarted","Data":"b6d9078f5d77076036e76471d710311fd6ad7bda015a4e1ec2d7172d4992b6dd"}
Apr 21 04:58:33.493962 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:33.493931 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"]
Apr 21 04:58:33.494459 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:33.494294 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz" podUID="5459210d-4eec-49fe-b8a4-05ea604a11fb" containerName="storage-initializer" containerID="cri-o://b6d9078f5d77076036e76471d710311fd6ad7bda015a4e1ec2d7172d4992b6dd" gracePeriod=30
Apr 21 04:58:34.559654 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.559627 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"]
Apr 21 04:58:34.560001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.559907 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kube-rbac-proxy"
Apr 21 04:58:34.560001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.559918 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kube-rbac-proxy"
Apr 21 04:58:34.560001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.559931 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="storage-initializer"
Apr 21 04:58:34.560001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.559937 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="storage-initializer"
Apr 21 04:58:34.560001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.559945 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container"
Apr 21 04:58:34.560001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.559951 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container"
Apr 21 04:58:34.560001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.560003 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kube-rbac-proxy"
Apr 21 04:58:34.560226 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.560013 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7491fd0a-842c-4776-b578-6c4bc226491a" containerName="kserve-container"
Apr 21 04:58:34.563009 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.562991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.565377 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.565356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\""
Apr 21 04:58:34.565377 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.565368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 21 04:58:34.565587 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.565574 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\""
Apr 21 04:58:34.574897 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.574878 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"]
Apr 21 04:58:34.658203 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.658180 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz_5459210d-4eec-49fe-b8a4-05ea604a11fb/storage-initializer/1.log"
Apr 21 04:58:34.659316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.658605 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz_5459210d-4eec-49fe-b8a4-05ea604a11fb/storage-initializer/0.log"
Apr 21 04:58:34.659316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.658642 2575 generic.go:358] "Generic (PLEG): container finished" podID="5459210d-4eec-49fe-b8a4-05ea604a11fb" containerID="b6d9078f5d77076036e76471d710311fd6ad7bda015a4e1ec2d7172d4992b6dd" exitCode=1
Apr 21 04:58:34.659316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.658776 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz" event={"ID":"5459210d-4eec-49fe-b8a4-05ea604a11fb","Type":"ContainerDied","Data":"b6d9078f5d77076036e76471d710311fd6ad7bda015a4e1ec2d7172d4992b6dd"}
Apr 21 04:58:34.659316 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.658814 2575 scope.go:117] "RemoveContainer" containerID="3203dbadb9e0647dddd1844cc01124a16d3b88f81fe6e7a9ee72c0169dd84970"
Apr 21 04:58:34.682384 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.682353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.682528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.682430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.682528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.682471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.682528 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.682501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448tb\" (UniqueName: \"kubernetes.io/projected/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kube-api-access-448tb\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.682717 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.682532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.725968 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.725949 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz_5459210d-4eec-49fe-b8a4-05ea604a11fb/storage-initializer/1.log"
Apr 21 04:58:34.726072 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.726012 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"
Apr 21 04:58:34.782889 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.782829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.782889 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.782875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.783034 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.782900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.783034 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.782919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-448tb\" (UniqueName: \"kubernetes.io/projected/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kube-api-access-448tb\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.783034 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.782944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.783350 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.783317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.783473 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.783450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"
Apr 21 04:58:34.783582 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.783560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod
\"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:58:34.785170 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.785152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:58:34.791277 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.791257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-448tb\" (UniqueName: \"kubernetes.io/projected/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kube-api-access-448tb\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:58:34.876492 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.876473 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:58:34.883783 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.883600 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5459210d-4eec-49fe-b8a4-05ea604a11fb-proxy-tls\") pod \"5459210d-4eec-49fe-b8a4-05ea604a11fb\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " Apr 21 04:58:34.883783 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.883634 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dncsd\" (UniqueName: \"kubernetes.io/projected/5459210d-4eec-49fe-b8a4-05ea604a11fb-kube-api-access-dncsd\") pod \"5459210d-4eec-49fe-b8a4-05ea604a11fb\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " Apr 21 04:58:34.883783 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.883666 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5459210d-4eec-49fe-b8a4-05ea604a11fb-kserve-provision-location\") pod \"5459210d-4eec-49fe-b8a4-05ea604a11fb\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " Apr 21 04:58:34.883783 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.883707 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5459210d-4eec-49fe-b8a4-05ea604a11fb-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"5459210d-4eec-49fe-b8a4-05ea604a11fb\" (UID: \"5459210d-4eec-49fe-b8a4-05ea604a11fb\") " Apr 21 04:58:34.884024 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.884000 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5459210d-4eec-49fe-b8a4-05ea604a11fb-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "5459210d-4eec-49fe-b8a4-05ea604a11fb" (UID: "5459210d-4eec-49fe-b8a4-05ea604a11fb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:58:34.884135 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.884084 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5459210d-4eec-49fe-b8a4-05ea604a11fb-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "5459210d-4eec-49fe-b8a4-05ea604a11fb" (UID: "5459210d-4eec-49fe-b8a4-05ea604a11fb"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:58:34.885496 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.885476 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5459210d-4eec-49fe-b8a4-05ea604a11fb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5459210d-4eec-49fe-b8a4-05ea604a11fb" (UID: "5459210d-4eec-49fe-b8a4-05ea604a11fb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:58:34.885596 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.885579 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5459210d-4eec-49fe-b8a4-05ea604a11fb-kube-api-access-dncsd" (OuterVolumeSpecName: "kube-api-access-dncsd") pod "5459210d-4eec-49fe-b8a4-05ea604a11fb" (UID: "5459210d-4eec-49fe-b8a4-05ea604a11fb"). InnerVolumeSpecName "kube-api-access-dncsd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:58:34.984750 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.984725 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5459210d-4eec-49fe-b8a4-05ea604a11fb-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:58:34.984750 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.984748 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dncsd\" (UniqueName: \"kubernetes.io/projected/5459210d-4eec-49fe-b8a4-05ea604a11fb-kube-api-access-dncsd\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:58:34.984930 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.984780 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5459210d-4eec-49fe-b8a4-05ea604a11fb-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:58:34.984930 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.984793 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5459210d-4eec-49fe-b8a4-05ea604a11fb-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:58:34.991199 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:34.991169 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"] Apr 21 04:58:34.994291 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:58:34.994269 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec1ebde_eb3a_4c1a_8e08_314862b6bb22.slice/crio-c8253c81a56ae3aa12f238c98ac692fc33663381e27aa42de51ef5744170ad79 WatchSource:0}: Error finding container 
c8253c81a56ae3aa12f238c98ac692fc33663381e27aa42de51ef5744170ad79: Status 404 returned error can't find the container with id c8253c81a56ae3aa12f238c98ac692fc33663381e27aa42de51ef5744170ad79 Apr 21 04:58:35.662350 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:35.662301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" event={"ID":"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22","Type":"ContainerStarted","Data":"96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8"} Apr 21 04:58:35.662350 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:35.662349 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" event={"ID":"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22","Type":"ContainerStarted","Data":"c8253c81a56ae3aa12f238c98ac692fc33663381e27aa42de51ef5744170ad79"} Apr 21 04:58:35.663434 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:35.663412 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz_5459210d-4eec-49fe-b8a4-05ea604a11fb/storage-initializer/1.log" Apr 21 04:58:35.663534 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:35.663506 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz" event={"ID":"5459210d-4eec-49fe-b8a4-05ea604a11fb","Type":"ContainerDied","Data":"9b8a59aa2ce99cc72d875db8e397fd89aafc19bc72806e6d7c5049c332c5a6bf"} Apr 21 04:58:35.663534 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:35.663529 2575 scope.go:117] "RemoveContainer" containerID="b6d9078f5d77076036e76471d710311fd6ad7bda015a4e1ec2d7172d4992b6dd" Apr 21 04:58:35.663640 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:35.663539 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz" Apr 21 04:58:35.706270 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:35.706243 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"] Apr 21 04:58:35.709712 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:35.709690 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-b5cdz"] Apr 21 04:58:36.667671 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:36.667635 2575 generic.go:358] "Generic (PLEG): container finished" podID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerID="96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8" exitCode=0 Apr 21 04:58:36.668152 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:36.667707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" event={"ID":"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22","Type":"ContainerDied","Data":"96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8"} Apr 21 04:58:37.028043 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:37.027967 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5459210d-4eec-49fe-b8a4-05ea604a11fb" path="/var/lib/kubelet/pods/5459210d-4eec-49fe-b8a4-05ea604a11fb/volumes" Apr 21 04:58:37.672852 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:37.672819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" event={"ID":"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22","Type":"ContainerStarted","Data":"85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923"} Apr 21 04:58:37.672852 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:37.672853 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" event={"ID":"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22","Type":"ContainerStarted","Data":"912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4"} Apr 21 04:58:37.673254 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:37.672985 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:58:37.694655 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:37.694615 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podStartSLOduration=3.69460268 podStartE2EDuration="3.69460268s" podCreationTimestamp="2026-04-21 04:58:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:58:37.692254779 +0000 UTC m=+3703.356844865" watchObservedRunningTime="2026-04-21 04:58:37.69460268 +0000 UTC m=+3703.359192765" Apr 21 04:58:38.675946 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:38.675915 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:58:38.677199 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:38.677165 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:58:39.678384 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:39.678340 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" 
podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:58:44.682499 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:44.682464 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:58:44.683018 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:44.682990 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:58:54.683218 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:58:54.683181 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:59:04.683050 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:04.683009 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:59:14.683619 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:14.683576 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:59:24.683194 ip-10-0-134-15 
kubenswrapper[2575]: I0421 04:59:24.683146 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:59:34.683667 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:34.683624 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:59:44.684478 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:44.684397 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:59:54.595810 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:54.595777 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"] Apr 21 04:59:54.596266 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:54.596091 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" containerID="cri-o://912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4" gracePeriod=30 Apr 21 04:59:54.596266 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:54.596117 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kube-rbac-proxy" 
containerID="cri-o://85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923" gracePeriod=30 Apr 21 04:59:54.678666 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:54.678631 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.66:8643/healthz\": dial tcp 10.132.0.66:8643: connect: connection refused" Apr 21 04:59:54.683408 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:54.683384 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 21 04:59:54.884771 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:54.884724 2575 generic.go:358] "Generic (PLEG): container finished" podID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerID="85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923" exitCode=2 Apr 21 04:59:54.884881 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:54.884788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" event={"ID":"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22","Type":"ContainerDied","Data":"85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923"} Apr 21 04:59:55.668041 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.668003 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs"] Apr 21 04:59:55.668508 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.668429 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5459210d-4eec-49fe-b8a4-05ea604a11fb" 
containerName="storage-initializer" Apr 21 04:59:55.668508 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.668448 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5459210d-4eec-49fe-b8a4-05ea604a11fb" containerName="storage-initializer" Apr 21 04:59:55.668508 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.668472 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5459210d-4eec-49fe-b8a4-05ea604a11fb" containerName="storage-initializer" Apr 21 04:59:55.668508 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.668479 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5459210d-4eec-49fe-b8a4-05ea604a11fb" containerName="storage-initializer" Apr 21 04:59:55.668731 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.668538 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5459210d-4eec-49fe-b8a4-05ea604a11fb" containerName="storage-initializer" Apr 21 04:59:55.668731 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.668675 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5459210d-4eec-49fe-b8a4-05ea604a11fb" containerName="storage-initializer" Apr 21 04:59:55.671481 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.671458 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.673769 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.673733 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 21 04:59:55.673886 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.673778 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 21 04:59:55.679597 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.679576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs"] Apr 21 04:59:55.748967 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.748936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.749118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.748999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.749118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.749022 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzcrk\" (UniqueName: \"kubernetes.io/projected/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kube-api-access-bzcrk\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.749118 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.749047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.850212 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.850184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.850329 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.850232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.850329 
ip-10-0-134-15 kubenswrapper[2575]: E0421 04:59:55.850315 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 21 04:59:55.850430 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.850348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzcrk\" (UniqueName: \"kubernetes.io/projected/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kube-api-access-bzcrk\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.850430 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:59:55.850374 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls podName:c848fbfb-4c20-4e70-96cf-864b89b6ee3a nodeName:}" failed. No retries permitted until 2026-04-21 04:59:56.35035733 +0000 UTC m=+3782.014947394 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" (UID: "c848fbfb-4c20-4e70-96cf-864b89b6ee3a") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 21 04:59:55.850430 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.850405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.850701 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.850684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.850851 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.850831 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:55.860837 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:55.860816 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bzcrk\" (UniqueName: \"kubernetes.io/projected/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kube-api-access-bzcrk\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:56.355398 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:56.355366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:56.357676 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:56.357644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:56.582566 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:56.582532 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 04:59:56.697865 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:56.697835 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs"] Apr 21 04:59:56.701362 ip-10-0-134-15 kubenswrapper[2575]: W0421 04:59:56.701332 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc848fbfb_4c20_4e70_96cf_864b89b6ee3a.slice/crio-8ae686f11b05b250904c5454d38144930638342c5efdc21adbe65202a09f422e WatchSource:0}: Error finding container 8ae686f11b05b250904c5454d38144930638342c5efdc21adbe65202a09f422e: Status 404 returned error can't find the container with id 8ae686f11b05b250904c5454d38144930638342c5efdc21adbe65202a09f422e Apr 21 04:59:56.703134 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:56.703119 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:59:56.891408 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:56.891328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" event={"ID":"c848fbfb-4c20-4e70-96cf-864b89b6ee3a","Type":"ContainerStarted","Data":"4b3df4df0cd4e528713becb5c446aa145765fd10fd9a4eeb23ce3856ddcfa3ef"} Apr 21 04:59:56.891408 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:56.891366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" event={"ID":"c848fbfb-4c20-4e70-96cf-864b89b6ee3a","Type":"ContainerStarted","Data":"8ae686f11b05b250904c5454d38144930638342c5efdc21adbe65202a09f422e"} Apr 21 04:59:58.528723 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.528701 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:59:58.572487 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.572461 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-proxy-tls\") pod \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " Apr 21 04:59:58.572585 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.572491 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448tb\" (UniqueName: \"kubernetes.io/projected/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kube-api-access-448tb\") pod \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " Apr 21 04:59:58.572585 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.572515 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " Apr 21 04:59:58.572662 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.572635 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kserve-provision-location\") pod \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " Apr 21 04:59:58.572694 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.572679 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-cabundle-cert\") pod 
\"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\" (UID: \"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22\") " Apr 21 04:59:58.572969 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.572943 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" (UID: "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:59:58.572969 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.572960 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" (UID: "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:59:58.573106 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.573033 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" (UID: "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:59:58.574657 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.574639 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" (UID: "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:59:58.574720 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.574659 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kube-api-access-448tb" (OuterVolumeSpecName: "kube-api-access-448tb") pod "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" (UID: "9ec1ebde-eb3a-4c1a-8e08-314862b6bb22"). InnerVolumeSpecName "kube-api-access-448tb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:59:58.674172 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.674150 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:59:58.674172 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.674171 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-448tb\" (UniqueName: \"kubernetes.io/projected/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kube-api-access-448tb\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:59:58.674342 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.674193 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:59:58.674342 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.674205 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:59:58.674342 ip-10-0-134-15 kubenswrapper[2575]: I0421 
04:59:58.674214 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22-cabundle-cert\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 04:59:58.898899 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.898874 2575 generic.go:358] "Generic (PLEG): container finished" podID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerID="912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4" exitCode=0 Apr 21 04:59:58.899019 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.898953 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" Apr 21 04:59:58.899066 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.898936 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" event={"ID":"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22","Type":"ContainerDied","Data":"912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4"} Apr 21 04:59:58.899066 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.899054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4" event={"ID":"9ec1ebde-eb3a-4c1a-8e08-314862b6bb22","Type":"ContainerDied","Data":"c8253c81a56ae3aa12f238c98ac692fc33663381e27aa42de51ef5744170ad79"} Apr 21 04:59:58.899134 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.899072 2575 scope.go:117] "RemoveContainer" containerID="85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923" Apr 21 04:59:58.910006 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.909986 2575 scope.go:117] "RemoveContainer" containerID="912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4" Apr 21 04:59:58.917192 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.917177 2575 scope.go:117] 
"RemoveContainer" containerID="96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8" Apr 21 04:59:58.920116 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.920094 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"] Apr 21 04:59:58.924616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.924488 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-pfjm4"] Apr 21 04:59:58.924616 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.924602 2575 scope.go:117] "RemoveContainer" containerID="85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923" Apr 21 04:59:58.924864 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:59:58.924845 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923\": container with ID starting with 85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923 not found: ID does not exist" containerID="85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923" Apr 21 04:59:58.924913 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.924871 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923"} err="failed to get container status \"85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923\": rpc error: code = NotFound desc = could not find container \"85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923\": container with ID starting with 85966525a28e34c137f0e8e792e7bc4f582d7f1d136d48c2b5efefafa9635923 not found: ID does not exist" Apr 21 04:59:58.924913 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.924887 2575 scope.go:117] "RemoveContainer" 
containerID="912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4" Apr 21 04:59:58.925104 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:59:58.925086 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4\": container with ID starting with 912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4 not found: ID does not exist" containerID="912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4" Apr 21 04:59:58.925143 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.925110 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4"} err="failed to get container status \"912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4\": rpc error: code = NotFound desc = could not find container \"912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4\": container with ID starting with 912542c01aa758de8c570705f61fb8fd42b83acb3dbbf603058953fd1262cae4 not found: ID does not exist" Apr 21 04:59:58.925143 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.925125 2575 scope.go:117] "RemoveContainer" containerID="96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8" Apr 21 04:59:58.925331 ip-10-0-134-15 kubenswrapper[2575]: E0421 04:59:58.925315 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8\": container with ID starting with 96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8 not found: ID does not exist" containerID="96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8" Apr 21 04:59:58.925374 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:58.925336 2575 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8"} err="failed to get container status \"96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8\": rpc error: code = NotFound desc = could not find container \"96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8\": container with ID starting with 96983ebfe02d2a110fa4da5628f826efe96fc8ed560b464351038b5a2c6b9ed8 not found: ID does not exist" Apr 21 04:59:59.027001 ip-10-0-134-15 kubenswrapper[2575]: I0421 04:59:59.026980 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" path="/var/lib/kubelet/pods/9ec1ebde-eb3a-4c1a-8e08-314862b6bb22/volumes" Apr 21 05:00:00.906740 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:00.906710 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_c848fbfb-4c20-4e70-96cf-864b89b6ee3a/storage-initializer/0.log" Apr 21 05:00:00.907150 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:00.906750 2575 generic.go:358] "Generic (PLEG): container finished" podID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" containerID="4b3df4df0cd4e528713becb5c446aa145765fd10fd9a4eeb23ce3856ddcfa3ef" exitCode=1 Apr 21 05:00:00.907150 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:00.906814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" event={"ID":"c848fbfb-4c20-4e70-96cf-864b89b6ee3a","Type":"ContainerDied","Data":"4b3df4df0cd4e528713becb5c446aa145765fd10fd9a4eeb23ce3856ddcfa3ef"} Apr 21 05:00:01.911500 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:01.911473 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_c848fbfb-4c20-4e70-96cf-864b89b6ee3a/storage-initializer/0.log" Apr 21 05:00:01.911910 ip-10-0-134-15 
kubenswrapper[2575]: I0421 05:00:01.911569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" event={"ID":"c848fbfb-4c20-4e70-96cf-864b89b6ee3a","Type":"ContainerStarted","Data":"d0a0188ef90d9d2a88aaced3de3789ac1c5aded137721ba29d4339b5be9e2f9b"} Apr 21 05:00:04.919657 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:04.919637 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_c848fbfb-4c20-4e70-96cf-864b89b6ee3a/storage-initializer/1.log" Apr 21 05:00:04.919999 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:04.919984 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_c848fbfb-4c20-4e70-96cf-864b89b6ee3a/storage-initializer/0.log" Apr 21 05:00:04.920060 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:04.920017 2575 generic.go:358] "Generic (PLEG): container finished" podID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" containerID="d0a0188ef90d9d2a88aaced3de3789ac1c5aded137721ba29d4339b5be9e2f9b" exitCode=1 Apr 21 05:00:04.920100 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:04.920059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" event={"ID":"c848fbfb-4c20-4e70-96cf-864b89b6ee3a","Type":"ContainerDied","Data":"d0a0188ef90d9d2a88aaced3de3789ac1c5aded137721ba29d4339b5be9e2f9b"} Apr 21 05:00:04.920100 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:04.920085 2575 scope.go:117] "RemoveContainer" containerID="4b3df4df0cd4e528713becb5c446aa145765fd10fd9a4eeb23ce3856ddcfa3ef" Apr 21 05:00:04.920502 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:04.920469 2575 scope.go:117] "RemoveContainer" containerID="4b3df4df0cd4e528713becb5c446aa145765fd10fd9a4eeb23ce3856ddcfa3ef" Apr 21 05:00:04.930909 ip-10-0-134-15 
kubenswrapper[2575]: E0421 05:00:04.930878 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_kserve-ci-e2e-test_c848fbfb-4c20-4e70-96cf-864b89b6ee3a_0 in pod sandbox 8ae686f11b05b250904c5454d38144930638342c5efdc21adbe65202a09f422e from index: no such id: '4b3df4df0cd4e528713becb5c446aa145765fd10fd9a4eeb23ce3856ddcfa3ef'" containerID="4b3df4df0cd4e528713becb5c446aa145765fd10fd9a4eeb23ce3856ddcfa3ef" Apr 21 05:00:04.930986 ip-10-0-134-15 kubenswrapper[2575]: E0421 05:00:04.930926 2575 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_kserve-ci-e2e-test_c848fbfb-4c20-4e70-96cf-864b89b6ee3a_0 in pod sandbox 8ae686f11b05b250904c5454d38144930638342c5efdc21adbe65202a09f422e from index: no such id: '4b3df4df0cd4e528713becb5c446aa145765fd10fd9a4eeb23ce3856ddcfa3ef'; Skipping pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_kserve-ci-e2e-test(c848fbfb-4c20-4e70-96cf-864b89b6ee3a)\"" logger="UnhandledError" Apr 21 05:00:04.932213 ip-10-0-134-15 kubenswrapper[2575]: E0421 05:00:04.932192 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_kserve-ci-e2e-test(c848fbfb-4c20-4e70-96cf-864b89b6ee3a)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" podUID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" Apr 21 05:00:05.660888 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:05.660859 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs"] Apr 21 05:00:05.924211 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:05.924141 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_c848fbfb-4c20-4e70-96cf-864b89b6ee3a/storage-initializer/1.log" Apr 21 05:00:06.051654 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.051636 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_c848fbfb-4c20-4e70-96cf-864b89b6ee3a/storage-initializer/1.log" Apr 21 05:00:06.051745 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.051700 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 05:00:06.125087 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.125060 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kserve-provision-location\") pod \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " Apr 21 05:00:06.125194 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.125114 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " Apr 21 05:00:06.125194 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.125167 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzcrk\" (UniqueName: 
\"kubernetes.io/projected/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kube-api-access-bzcrk\") pod \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " Apr 21 05:00:06.125313 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.125205 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls\") pod \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\" (UID: \"c848fbfb-4c20-4e70-96cf-864b89b6ee3a\") " Apr 21 05:00:06.125373 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.125295 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c848fbfb-4c20-4e70-96cf-864b89b6ee3a" (UID: "c848fbfb-4c20-4e70-96cf-864b89b6ee3a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 05:00:06.125487 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.125439 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "c848fbfb-4c20-4e70-96cf-864b89b6ee3a" (UID: "c848fbfb-4c20-4e70-96cf-864b89b6ee3a"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 05:00:06.125712 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.125688 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kserve-provision-location\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 05:00:06.125853 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.125834 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 05:00:06.127186 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.127165 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c848fbfb-4c20-4e70-96cf-864b89b6ee3a" (UID: "c848fbfb-4c20-4e70-96cf-864b89b6ee3a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 05:00:06.127374 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.127344 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kube-api-access-bzcrk" (OuterVolumeSpecName: "kube-api-access-bzcrk") pod "c848fbfb-4c20-4e70-96cf-864b89b6ee3a" (UID: "c848fbfb-4c20-4e70-96cf-864b89b6ee3a"). InnerVolumeSpecName "kube-api-access-bzcrk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 05:00:06.231565 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.227593 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bzcrk\" (UniqueName: \"kubernetes.io/projected/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-kube-api-access-bzcrk\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 05:00:06.231565 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.227623 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c848fbfb-4c20-4e70-96cf-864b89b6ee3a-proxy-tls\") on node \"ip-10-0-134-15.ec2.internal\" DevicePath \"\"" Apr 21 05:00:06.928000 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.927975 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs_c848fbfb-4c20-4e70-96cf-864b89b6ee3a/storage-initializer/1.log" Apr 21 05:00:06.928378 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.928059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" event={"ID":"c848fbfb-4c20-4e70-96cf-864b89b6ee3a","Type":"ContainerDied","Data":"8ae686f11b05b250904c5454d38144930638342c5efdc21adbe65202a09f422e"} Apr 21 05:00:06.928378 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.928098 2575 scope.go:117] "RemoveContainer" containerID="d0a0188ef90d9d2a88aaced3de3789ac1c5aded137721ba29d4339b5be9e2f9b" Apr 21 05:00:06.928378 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.928113 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs" Apr 21 05:00:06.962901 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.962876 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs"] Apr 21 05:00:06.968372 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:06.968351 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-j6nbs"] Apr 21 05:00:07.027343 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:07.027310 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" path="/var/lib/kubelet/pods/c848fbfb-4c20-4e70-96cf-864b89b6ee3a/volumes" Apr 21 05:00:34.830546 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830513 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vqvxw/must-gather-m6gvz"] Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830811 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830824 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container" Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830834 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kube-rbac-proxy" Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830839 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kube-rbac-proxy" Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830847 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="storage-initializer"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830854 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="storage-initializer"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830866 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" containerName="storage-initializer"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830871 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" containerName="storage-initializer"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830878 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" containerName="storage-initializer"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830883 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" containerName="storage-initializer"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830931 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" containerName="storage-initializer"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830939 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c848fbfb-4c20-4e70-96cf-864b89b6ee3a" containerName="storage-initializer"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830945 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kube-rbac-proxy"
Apr 21 05:00:34.830984 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.830952 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ec1ebde-eb3a-4c1a-8e08-314862b6bb22" containerName="kserve-container"
Apr 21 05:00:34.833777 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.833748 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vqvxw/must-gather-m6gvz"
Apr 21 05:00:34.836249 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.836221 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vqvxw\"/\"openshift-service-ca.crt\""
Apr 21 05:00:34.836249 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.836223 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vqvxw\"/\"default-dockercfg-vstpr\""
Apr 21 05:00:34.836436 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.836377 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vqvxw\"/\"kube-root-ca.crt\""
Apr 21 05:00:34.840303 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.840276 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqvxw/must-gather-m6gvz"]
Apr 21 05:00:34.917660 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.917632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ec680e37-a305-4290-a924-b4cb43e0f738-must-gather-output\") pod \"must-gather-m6gvz\" (UID: \"ec680e37-a305-4290-a924-b4cb43e0f738\") " pod="openshift-must-gather-vqvxw/must-gather-m6gvz"
Apr 21 05:00:34.917660 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:34.917662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn7lt\" (UniqueName: \"kubernetes.io/projected/ec680e37-a305-4290-a924-b4cb43e0f738-kube-api-access-rn7lt\") pod \"must-gather-m6gvz\" (UID: \"ec680e37-a305-4290-a924-b4cb43e0f738\") " pod="openshift-must-gather-vqvxw/must-gather-m6gvz"
Apr 21 05:00:35.018121 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:35.018097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ec680e37-a305-4290-a924-b4cb43e0f738-must-gather-output\") pod \"must-gather-m6gvz\" (UID: \"ec680e37-a305-4290-a924-b4cb43e0f738\") " pod="openshift-must-gather-vqvxw/must-gather-m6gvz"
Apr 21 05:00:35.018231 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:35.018123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn7lt\" (UniqueName: \"kubernetes.io/projected/ec680e37-a305-4290-a924-b4cb43e0f738-kube-api-access-rn7lt\") pod \"must-gather-m6gvz\" (UID: \"ec680e37-a305-4290-a924-b4cb43e0f738\") " pod="openshift-must-gather-vqvxw/must-gather-m6gvz"
Apr 21 05:00:35.018409 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:35.018391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ec680e37-a305-4290-a924-b4cb43e0f738-must-gather-output\") pod \"must-gather-m6gvz\" (UID: \"ec680e37-a305-4290-a924-b4cb43e0f738\") " pod="openshift-must-gather-vqvxw/must-gather-m6gvz"
Apr 21 05:00:35.026238 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:35.026216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn7lt\" (UniqueName: \"kubernetes.io/projected/ec680e37-a305-4290-a924-b4cb43e0f738-kube-api-access-rn7lt\") pod \"must-gather-m6gvz\" (UID: \"ec680e37-a305-4290-a924-b4cb43e0f738\") " pod="openshift-must-gather-vqvxw/must-gather-m6gvz"
Apr 21 05:00:35.143311 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:35.143294 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vqvxw/must-gather-m6gvz"
Apr 21 05:00:35.254146 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:35.254115 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqvxw/must-gather-m6gvz"]
Apr 21 05:00:35.257342 ip-10-0-134-15 kubenswrapper[2575]: W0421 05:00:35.257309 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec680e37_a305_4290_a924_b4cb43e0f738.slice/crio-03728b35f59f33c0529568533e2353564cb81ddb97b462be4509ec8a6f97d90e WatchSource:0}: Error finding container 03728b35f59f33c0529568533e2353564cb81ddb97b462be4509ec8a6f97d90e: Status 404 returned error can't find the container with id 03728b35f59f33c0529568533e2353564cb81ddb97b462be4509ec8a6f97d90e
Apr 21 05:00:36.007802 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:36.007744 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/must-gather-m6gvz" event={"ID":"ec680e37-a305-4290-a924-b4cb43e0f738","Type":"ContainerStarted","Data":"03728b35f59f33c0529568533e2353564cb81ddb97b462be4509ec8a6f97d90e"}
Apr 21 05:00:37.012723 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:37.012684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/must-gather-m6gvz" event={"ID":"ec680e37-a305-4290-a924-b4cb43e0f738","Type":"ContainerStarted","Data":"effa2f17ad3a2674d926b74549489ad8d8bb1c61719a9fd99e94a5922c6b3da1"}
Apr 21 05:00:37.012723 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:37.012724 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/must-gather-m6gvz" event={"ID":"ec680e37-a305-4290-a924-b4cb43e0f738","Type":"ContainerStarted","Data":"b602c35154dd767ed3a83a0281bde2b53da11479fb9d9d3013be08402078b82e"}
Apr 21 05:00:37.027870 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:37.027635 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vqvxw/must-gather-m6gvz" podStartSLOduration=2.130693043 podStartE2EDuration="3.027617902s" podCreationTimestamp="2026-04-21 05:00:34 +0000 UTC" firstStartedPulling="2026-04-21 05:00:35.259053012 +0000 UTC m=+3820.923643076" lastFinishedPulling="2026-04-21 05:00:36.155977872 +0000 UTC m=+3821.820567935" observedRunningTime="2026-04-21 05:00:37.026949987 +0000 UTC m=+3822.691540072" watchObservedRunningTime="2026-04-21 05:00:37.027617902 +0000 UTC m=+3822.692207993"
Apr 21 05:00:37.689812 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:37.689780 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lc86l_9b00db34-9aa2-4d7a-82d3-34c2ded3bcd3/global-pull-secret-syncer/0.log"
Apr 21 05:00:37.827358 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:37.827330 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6w2n8_332698b5-e816-48f5-806f-295e3ed3f8fb/konnectivity-agent/0.log"
Apr 21 05:00:37.977997 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:37.977929 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-15.ec2.internal_cb235a448aa33328082b3f09d40d9023/haproxy/0.log"
Apr 21 05:00:41.609023 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:41.608988 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4hlnc_39d1ef11-133c-4565-a940-89e38405f4e3/node-exporter/0.log"
Apr 21 05:00:41.631387 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:41.631358 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4hlnc_39d1ef11-133c-4565-a940-89e38405f4e3/kube-rbac-proxy/0.log"
Apr 21 05:00:41.655829 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:41.655800 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4hlnc_39d1ef11-133c-4565-a940-89e38405f4e3/init-textfile/0.log"
Apr 21 05:00:44.892863 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:44.892831 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"]
Apr 21 05:00:44.897057 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:44.897029 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:44.902626 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:44.902602 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"]
Apr 21 05:00:45.005733 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.005703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqdzl\" (UniqueName: \"kubernetes.io/projected/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-kube-api-access-kqdzl\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.005936 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.005746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-sys\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.005936 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.005831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-podres\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.006041 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.005938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-proc\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.006041 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.005969 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-lib-modules\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.106599 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.106564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-lib-modules\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.106926 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.106901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqdzl\" (UniqueName: \"kubernetes.io/projected/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-kube-api-access-kqdzl\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.107104 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.107087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-sys\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.107337 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.107320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-podres\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.107495 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.107470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-podres\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.107495 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.107495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-proc\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.107495 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.107263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-sys\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.107663 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.106905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-lib-modules\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.107663 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.107576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-proc\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.115071 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.115043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqdzl\" (UniqueName: \"kubernetes.io/projected/0e10bc23-3958-4696-9dc8-e7aa5b6049b5-kube-api-access-kqdzl\") pod \"perf-node-gather-daemonset-8qsg9\" (UID: \"0e10bc23-3958-4696-9dc8-e7aa5b6049b5\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.210089 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.210021 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:45.344431 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.344399 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"]
Apr 21 05:00:45.347617 ip-10-0-134-15 kubenswrapper[2575]: W0421 05:00:45.347568 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e10bc23_3958_4696_9dc8_e7aa5b6049b5.slice/crio-3a59a11fefa46929dd2f4b1d8b99915aaa4539954fff43475f4ef3297d91e901 WatchSource:0}: Error finding container 3a59a11fefa46929dd2f4b1d8b99915aaa4539954fff43475f4ef3297d91e901: Status 404 returned error can't find the container with id 3a59a11fefa46929dd2f4b1d8b99915aaa4539954fff43475f4ef3297d91e901
Apr 21 05:00:45.488039 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.487980 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hz28z_a7fc9465-0576-4a91-ba4c-913400d12eb3/dns/0.log"
Apr 21 05:00:45.506774 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.506729 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hz28z_a7fc9465-0576-4a91-ba4c-913400d12eb3/kube-rbac-proxy/0.log"
Apr 21 05:00:45.529776 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:45.529731 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-j8nd9_89bfd36f-311d-477d-a1ba-cc9a2854b55d/dns-node-resolver/0.log"
Apr 21 05:00:46.057798 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:46.057738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9" event={"ID":"0e10bc23-3958-4696-9dc8-e7aa5b6049b5","Type":"ContainerStarted","Data":"e5ed7e450710f248149cdf8631c285ee686f7cbaf7a477ec405a5ba1f5d64f8c"}
Apr 21 05:00:46.058154 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:46.057804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9" event={"ID":"0e10bc23-3958-4696-9dc8-e7aa5b6049b5","Type":"ContainerStarted","Data":"3a59a11fefa46929dd2f4b1d8b99915aaa4539954fff43475f4ef3297d91e901"}
Apr 21 05:00:46.058154 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:46.057823 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:46.064368 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:46.064340 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tqkpr_48670ed3-db9b-4299-a9b9-270bf4c32561/node-ca/0.log"
Apr 21 05:00:46.073994 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:46.073940 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9" podStartSLOduration=2.073921757 podStartE2EDuration="2.073921757s" podCreationTimestamp="2026-04-21 05:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 05:00:46.073345688 +0000 UTC m=+3831.737935775" watchObservedRunningTime="2026-04-21 05:00:46.073921757 +0000 UTC m=+3831.738511842"
Apr 21 05:00:47.086440 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:47.086412 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6kgc9_8b7373a9-03be-47d1-9a03-d92ce2e99f2a/serve-healthcheck-canary/0.log"
Apr 21 05:00:47.637067 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:47.637038 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s9nkw_bda59d05-4057-4aab-ae91-a860d3e62ba1/kube-rbac-proxy/0.log"
Apr 21 05:00:47.656599 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:47.656571 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s9nkw_bda59d05-4057-4aab-ae91-a860d3e62ba1/exporter/0.log"
Apr 21 05:00:47.677814 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:47.677793 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s9nkw_bda59d05-4057-4aab-ae91-a860d3e62ba1/extractor/0.log"
Apr 21 05:00:49.645980 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:49.645938 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-9zk6d_1e07702c-3b11-4397-a13a-e55dc0e6cc99/server/0.log"
Apr 21 05:00:49.891186 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:49.891154 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-mbzsg_3dba583d-6ff8-4afe-9529-9ad2120fa3e2/manager/0.log"
Apr 21 05:00:49.978287 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:49.978215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-jhrnv_d7a895f7-adb0-4db2-9739-b92a466458f7/seaweedfs/0.log"
Apr 21 05:00:50.000042 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:50.000021 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-94wk8_1ad35793-a530-4240-9d36-eeca50239573/seaweedfs-tls-custom/0.log"
Apr 21 05:00:50.022122 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:50.022098 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-sxjd7_78d45106-756d-4814-b4a2-020d8e60ccb9/seaweedfs-tls-serving/0.log"
Apr 21 05:00:52.069681 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:52.069647 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-8qsg9"
Apr 21 05:00:55.431454 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.431425 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phc24_aa5d6fee-c188-4cee-a9dd-cc90927bef31/kube-multus-additional-cni-plugins/0.log"
Apr 21 05:00:55.457793 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.457751 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phc24_aa5d6fee-c188-4cee-a9dd-cc90927bef31/egress-router-binary-copy/0.log"
Apr 21 05:00:55.481327 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.481308 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phc24_aa5d6fee-c188-4cee-a9dd-cc90927bef31/cni-plugins/0.log"
Apr 21 05:00:55.502431 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.502411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phc24_aa5d6fee-c188-4cee-a9dd-cc90927bef31/bond-cni-plugin/0.log"
Apr 21 05:00:55.521531 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.521513 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phc24_aa5d6fee-c188-4cee-a9dd-cc90927bef31/routeoverride-cni/0.log"
Apr 21 05:00:55.541294 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.541270 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phc24_aa5d6fee-c188-4cee-a9dd-cc90927bef31/whereabouts-cni-bincopy/0.log"
Apr 21 05:00:55.564149 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.564124 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-phc24_aa5d6fee-c188-4cee-a9dd-cc90927bef31/whereabouts-cni/0.log"
Apr 21 05:00:55.634238 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.634208 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zgz2x_494fdf31-01dd-419f-a5be-8ea679099b8e/kube-multus/0.log"
Apr 21 05:00:55.658426 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.658394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gpfrt_a42d1f99-c5af-45d0-9ce8-8affe0d01ea4/network-metrics-daemon/0.log"
Apr 21 05:00:55.677208 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:55.677184 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gpfrt_a42d1f99-c5af-45d0-9ce8-8affe0d01ea4/kube-rbac-proxy/0.log"
Apr 21 05:00:57.119358 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:57.119322 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxcbd_9acdf950-6bdd-4903-943b-90a6f96b5271/ovn-controller/0.log"
Apr 21 05:00:57.157430 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:57.157409 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxcbd_9acdf950-6bdd-4903-943b-90a6f96b5271/ovn-acl-logging/0.log"
Apr 21 05:00:57.175415 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:57.175392 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxcbd_9acdf950-6bdd-4903-943b-90a6f96b5271/kube-rbac-proxy-node/0.log"
Apr 21 05:00:57.195040 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:57.195008 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxcbd_9acdf950-6bdd-4903-943b-90a6f96b5271/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 05:00:57.214070 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:57.214049 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxcbd_9acdf950-6bdd-4903-943b-90a6f96b5271/northd/0.log"
Apr 21 05:00:57.234728 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:57.234710 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxcbd_9acdf950-6bdd-4903-943b-90a6f96b5271/nbdb/0.log"
Apr 21 05:00:57.258099 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:57.258079 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxcbd_9acdf950-6bdd-4903-943b-90a6f96b5271/sbdb/0.log"
Apr 21 05:00:57.379582 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:57.379524 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxcbd_9acdf950-6bdd-4903-943b-90a6f96b5271/ovnkube-controller/0.log"
Apr 21 05:00:58.303541 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:58.303513 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-tvjbx_64ecb179-9e59-428a-8a4d-a5bfdc94ac99/check-endpoints/0.log"
Apr 21 05:00:58.324022 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:58.323998 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9cslm_541c9c2c-7803-469b-9f76-fa3ec6995458/network-check-target-container/0.log"
Apr 21 05:00:59.279599 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:59.279570 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rzjbf_14f705da-fc93-4286-8c99-eb46f7420053/iptables-alerter/0.log"
Apr 21 05:00:59.876894 ip-10-0-134-15 kubenswrapper[2575]: I0421 05:00:59.876870 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ldkbb_021801b3-5414-43aa-8164-e9365f642f91/tuned/0.log"