Apr 16 22:11:06.128557 ip-10-0-133-183 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 22:11:06.128568 ip-10-0-133-183 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 22:11:06.128574 ip-10-0-133-183 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 22:11:06.128792 ip-10-0-133-183 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 22:11:16.152831 ip-10-0-133-183 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 22:11:16.152848 ip-10-0-133-183 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot dd8e0618194f4040bcd9e856b14f7c3c --
Apr 16 22:13:40.759613 ip-10-0-133-183 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:41.189874 ip-10-0-133-183 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:41.189874 ip-10-0-133-183 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:41.189874 ip-10-0-133-183 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:41.189874 ip-10-0-133-183 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:41.189874 ip-10-0-133-183 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:41.192096 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.191998 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:41.196558 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196529 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:41.196558 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196557 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:41.196558 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196562 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:41.196558 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196566 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:41.196558 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196569 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:41.196558 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196571 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196574 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196578 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196581 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196584 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196586 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196589 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196592 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196600 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196603 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196606 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196610 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196613 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196616 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196619 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196622 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196625 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196628 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196630 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:41.196731 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196633 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196636 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196638 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196642 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196644 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196647 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196649 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196652 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196654 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196657 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196660 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196662 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196664 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196667 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196669 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196673 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196676 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196679 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196682 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196684 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:41.197180 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196687 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196689 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196692 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196695 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196697 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196700 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196704 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196709 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196712 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196715 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196718 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196720 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196724 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196727 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196730 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196732 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196735 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196737 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196740 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:41.197726 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196744 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196748 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196751 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196754 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196756 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196759 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196762 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196764 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196768 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196771 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196773 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196777 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196781 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196783 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196787 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196789 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196792 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196794 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196797 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196799 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:41.198188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196802 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196804 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.196807 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197207 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197212 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197215 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197218 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197221 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197224 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197226 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197229 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197231 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197235 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197238 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197241 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197244 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197246 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197249 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197251 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197254 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:41.198688 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197257 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197260 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197264 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197268 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197271 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197274 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197277 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197279 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197282 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197284 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197286 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197289 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197291 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197294 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197297 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197299 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197302 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197304 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197307 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:41.199169 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197309 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197313 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197315 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197318 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197333 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197338 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197341 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197343 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197346 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197349 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197351 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197354 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197357 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197360 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197362 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197367 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197370 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197374 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197377 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:41.199659 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197380 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197383 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197386 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197388 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197391 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197393 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197396 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197398 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197401 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197403 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197406 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197408 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197411 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197413 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197416 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197418 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197421 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197423 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197427 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197430 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:41.200134 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197432 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197435 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197438 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197440 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197443 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197446 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197448 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197450 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197453 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197456 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.197459 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199138 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199148 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199156 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199161 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199166 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199170 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199175 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199180 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199183 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199187 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199190 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:41.200657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199194 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199197 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199200 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199203 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199206 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199209 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199212 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199214 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199219 2576 flags.go:64] FLAG: --cluster-domain="" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199222 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199225 2576 flags.go:64] FLAG: --config-dir="" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199228 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199232 2576 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199236 2576 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199239 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199242 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199246 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199249 2576 flags.go:64] FLAG: --contention-profiling="false" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199252 2576 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199256 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199259 2576 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199263 2576 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 22:13:41.201204 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:13:41.199267 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199270 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199273 2576 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 22:13:41.201204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199276 2576 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199279 2576 flags.go:64] FLAG: --enable-server="true" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199282 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199288 2576 flags.go:64] FLAG: --event-burst="100" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199291 2576 flags.go:64] FLAG: --event-qps="50" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199294 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199297 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199300 2576 flags.go:64] FLAG: --eviction-hard="" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199304 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199307 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199310 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199313 2576 flags.go:64] FLAG: 
--eviction-soft="" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199316 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199319 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199338 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199341 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199345 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199348 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199351 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199355 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199358 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199361 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199365 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199368 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199371 2576 flags.go:64] FLAG: --help="false" Apr 16 22:13:41.201827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199375 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.202440 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:13:41.199378 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199381 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199385 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199388 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199392 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199395 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199398 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199401 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199404 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199407 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199410 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199413 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199416 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199419 2576 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199422 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199425 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199428 2576 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199431 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199434 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199437 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199443 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199447 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199450 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:41.202440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199453 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199456 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199459 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199462 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199465 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:41.203052 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:13:41.199469 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199472 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199476 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199480 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199483 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199486 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199489 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199492 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199495 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199499 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199507 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199510 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199513 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199517 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199519 2576 
flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199526 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199529 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199533 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199536 2576 flags.go:64] FLAG: --port="10250" Apr 16 22:13:41.203052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199539 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199542 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a39e3d604cef3c6a" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199546 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199549 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199552 2576 flags.go:64] FLAG: --register-node="true" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199555 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199558 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199562 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199565 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199568 2576 flags.go:64] FLAG: --reserved-cpus="" 
Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199571 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199574 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199577 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199580 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199586 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199589 2576 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199592 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199596 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199599 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199602 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199605 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199609 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199613 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199616 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:13:41.199619 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199622 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:13:41.203669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199625 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199628 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199631 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199634 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199637 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199643 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199646 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199648 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199652 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199656 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199658 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199661 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199665 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 
22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199668 2576 flags.go:64] FLAG: --v="2" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199673 2576 flags.go:64] FLAG: --version="false" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199677 2576 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199681 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.199685 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199785 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199789 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199793 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199796 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199799 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199801 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:41.204277 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199804 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199807 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199810 2576 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199813 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199816 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199818 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199821 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199824 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199827 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199829 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199832 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199835 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199837 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199839 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199842 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:41.204883 
ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199844 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199847 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199850 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199852 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199855 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:41.204883 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199858 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199860 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199863 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199865 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199868 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199870 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199873 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199877 2576 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199882 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199885 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199888 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199891 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199893 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199896 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199898 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199901 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199903 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199906 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199909 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:41.205494 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199911 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 
16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199914 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199917 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199919 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199922 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199925 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199927 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199929 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199932 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199934 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199937 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199940 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199943 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199945 2576 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199948 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199950 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199953 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199955 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199959 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:41.205991 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199963 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199966 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199970 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199973 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199975 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199978 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199980 2576 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199982 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199985 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199987 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199990 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199992 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.199997 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200000 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200002 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200005 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200007 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200010 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200012 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200018 2576 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:41.206520 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200021 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:41.207204 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.200024 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:41.207204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.200741 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:41.210052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.210025 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:13:41.210052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.210049 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210101 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210107 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210110 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210114 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 
22:13:41.210117 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210120 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210123 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210126 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210130 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210132 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210135 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210138 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210141 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210144 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210147 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210149 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210152 2576 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210155 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:41.210188 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210158 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210160 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210163 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210166 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210168 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210172 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210174 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210177 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210180 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210182 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210185 2576 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210188 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210190 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210193 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210196 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210199 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210202 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210206 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210210 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:41.210680 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210215 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210218 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210221 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210224 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210227 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210230 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210232 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210235 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210237 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210240 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210243 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210245 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210248 2576 feature_gate.go:328] unrecognized feature gate: 
MetricsCollectionProfiles Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210252 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210256 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210259 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210262 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210265 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210268 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210270 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:41.211144 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210273 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210275 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210278 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210281 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210283 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: 
W0416 22:13:41.210286 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210288 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210291 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210294 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210296 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210299 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210301 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210304 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210306 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210309 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210311 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210314 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210316 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:41.211650 ip-10-0-133-183 
kubenswrapper[2576]: W0416 22:13:41.210331 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210335 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:41.211650 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210337 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210340 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210342 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210345 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210347 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210350 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210352 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210356 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210359 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.210364 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false 
ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210485 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210491 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210494 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210497 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210500 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:41.212127 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210503 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210506 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210509 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210512 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210514 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210517 2576 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210519 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210522 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210524 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210527 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210529 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210532 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210534 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210537 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210539 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210542 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210545 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210548 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210550 2576 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210553 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:41.212594 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210555 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210558 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210560 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210563 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210566 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210568 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210572 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210574 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210577 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210579 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210582 2576 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210584 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210587 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210589 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210592 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210594 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210597 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210599 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210602 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210604 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:41.213086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210607 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210609 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210612 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 
22:13:41.210614 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210617 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210619 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210622 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210624 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210626 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210629 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210632 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210635 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210638 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210640 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210642 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210645 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 
22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210648 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210651 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210656 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:13:41.213591 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210660 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210664 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210667 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210670 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210673 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210675 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210678 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210680 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210683 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 
16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210686 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210689 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210692 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210694 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210697 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210699 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210702 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210705 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210707 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210710 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210712 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:41.214065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210714 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:41.214606 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:41.210717 2576 feature_gate.go:328] 
unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:41.214606 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.210722 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:41.214606 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.211528 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 22:13:41.214606 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.214250 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 22:13:41.215125 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.215113 2576 server.go:1019] "Starting client certificate rotation" Apr 16 22:13:41.215234 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.215216 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:41.215289 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.215268 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:41.239933 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.239903 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:41.243881 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.243848 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:41.257788 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:13:41.257766 2576 log.go:25] "Validated CRI v1 runtime API" Apr 16 22:13:41.265397 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.265377 2576 log.go:25] "Validated CRI v1 image API" Apr 16 22:13:41.269810 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.269794 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 22:13:41.272852 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.272835 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:41.274549 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.274518 2576 fs.go:135] Filesystem UUIDs: map[0a5e0bd2-7b39-41d1-8bcb-ed7259e6bf77:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c45f6364-cd28-4394-9167-4f78415cb51d:/dev/nvme0n1p3] Apr 16 22:13:41.274602 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.274552 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 22:13:41.281622 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.281510 2576 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:41.279471146 +0000 UTC m=+0.404095424 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100345 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b342333d624a410e24d6130d06982 
SystemUUID:ec2b3423-33d6-24a4-10e2-4d6130d06982 BootID:dd8e0618-194f-4040-bcd9-e856b14f7c3c Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f0:1c:5d:5a:f5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f0:1c:5d:5a:f5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:44:41:f4:27:c6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] 
Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 22:13:41.281622 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.281615 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 22:13:41.281765 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.281701 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 22:13:41.282720 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.282690 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 22:13:41.282914 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.282724 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-183.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Pe
rcentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 22:13:41.283006 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.282927 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 22:13:41.283006 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.282940 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 22:13:41.283006 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.282959 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:41.283751 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.283739 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:41.284639 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.284626 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:41.284766 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.284755 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 22:13:41.288204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.288192 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 22:13:41.288271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.288211 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 22:13:41.288271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.288230 2576 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 16 22:13:41.288271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.288244 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 22:13:41.288271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.288257 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 22:13:41.289909 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.289895 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:41.289987 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.289919 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:41.293169 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.293120 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:13:41.294470 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.294456 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:13:41.296190 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296172 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vzrct" Apr 16 22:13:41.296395 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296384 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:13:41.296453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296402 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:13:41.296453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296419 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:13:41.296453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296425 2576 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:13:41.296453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296431 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:13:41.296453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296436 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:13:41.296453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296441 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 22:13:41.296453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296447 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:13:41.296453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296455 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:13:41.296661 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296462 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:13:41.296661 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296471 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:13:41.296661 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.296479 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:13:41.298090 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.298080 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:13:41.298090 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.298090 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 22:13:41.301293 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.301269 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 22:13:41.301293 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.301286 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-183.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 22:13:41.301901 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.301889 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:13:41.301936 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.301926 2576 server.go:1295] "Started kubelet" Apr 16 22:13:41.302085 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.302030 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:13:41.302170 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.302044 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 22:13:41.302170 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.302158 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:13:41.302776 ip-10-0-133-183 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 22:13:41.303303 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.303220 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:13:41.304936 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.304922 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:13:41.308412 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.308387 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:13:41.308412 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.308395 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:41.308590 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.308567 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vzrct" Apr 16 22:13:41.309429 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.309409 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-183.ec2.internal\" not found" Apr 16 22:13:41.309533 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.309428 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:13:41.309533 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.309431 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 22:13:41.309533 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.309458 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:13:41.309691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.309572 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:41.309691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.309584 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:13:41.310958 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.310909 2576 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"ip-10-0-133-183.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 22:13:41.311064 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.311019 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-183.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 22:13:41.311559 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.311545 2576 factory.go:55] Registering systemd factory Apr 16 22:13:41.311673 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.310685 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-183.ec2.internal.18a6f608f0a736d4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-183.ec2.internal,UID:ip-10-0-133-183.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-183.ec2.internal,},FirstTimestamp:2026-04-16 22:13:41.301901012 +0000 UTC m=+0.426525289,LastTimestamp:2026-04-16 22:13:41.301901012 +0000 UTC m=+0.426525289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-183.ec2.internal,}" Apr 16 22:13:41.311806 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.311702 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:13:41.313683 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.313661 2576 factory.go:153] Registering CRI-O factory Apr 16 
22:13:41.313792 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.313782 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 22:13:41.313930 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.313912 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:13:41.314025 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.313944 2576 factory.go:103] Registering Raw factory Apr 16 22:13:41.314025 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.313961 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 22:13:41.314371 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.314345 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 22:13:41.314667 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.314649 2576 manager.go:319] Starting recovery of all containers Apr 16 22:13:41.319098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.319070 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:41.319550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.319509 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 22:13:41.326510 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.326391 2576 manager.go:324] Recovery completed Apr 16 22:13:41.330618 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.330604 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:41.333196 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.333181 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:41.333257 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.333210 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:41.333257 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.333223 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:41.333697 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.333685 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 22:13:41.333741 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.333697 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 22:13:41.333741 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.333716 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:41.336024 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.336013 2576 policy_none.go:49] "None policy: Start" Apr 16 22:13:41.336081 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.336028 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 22:13:41.336081 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.336037 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 22:13:41.383150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.381915 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 22:13:41.383150 ip-10-0-133-183 
kubenswrapper[2576]: E0416 22:13:41.381947 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 22:13:41.383150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.381967 2576 server.go:85] "Starting device plugin registration server" Apr 16 22:13:41.383150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.382320 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 22:13:41.383150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.382395 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 22:13:41.383150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.382493 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 22:13:41.383150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.382557 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 22:13:41.383150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.382566 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 22:13:41.383607 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.383161 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 22:13:41.383607 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.383209 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-183.ec2.internal\" not found" Apr 16 22:13:41.448105 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.448034 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 16 22:13:41.448105 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.448068 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 22:13:41.448105 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.448087 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 22:13:41.448105 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.448094 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 22:13:41.448357 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.448125 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 22:13:41.451362 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.451339 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:41.483250 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.483221 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:41.484340 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.484311 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:41.484419 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.484354 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:41.484419 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.484365 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:41.484419 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.484387 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.491149 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:13:41.491133 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.491198 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.491158 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-183.ec2.internal\": node \"ip-10-0-133-183.ec2.internal\" not found" Apr 16 22:13:41.510681 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.510657 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-183.ec2.internal\" not found" Apr 16 22:13:41.548428 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.548388 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal"] Apr 16 22:13:41.548505 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.548486 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:41.549460 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.549437 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:41.549587 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.549467 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:41.549587 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.549478 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:41.550729 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.550715 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:41.550908 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.550890 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.550975 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.550929 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:41.551498 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.551472 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:41.551580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.551507 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:41.551580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.551517 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:41.551580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.551478 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:41.551580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.551562 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:41.551580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.551572 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:41.552716 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.552703 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.552766 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.552726 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:41.553439 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.553424 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:41.553521 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.553450 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:41.553521 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.553465 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:41.582431 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.582407 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-183.ec2.internal\" not found" node="ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.586812 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.586793 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-183.ec2.internal\" not found" node="ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.610873 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.610842 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-183.ec2.internal\" not found" Apr 16 22:13:41.611001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.610906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/08cb6c990c70793064f4bf15d19ee216-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal\" (UID: \"08cb6c990c70793064f4bf15d19ee216\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.611001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.610930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08cb6c990c70793064f4bf15d19ee216-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal\" (UID: \"08cb6c990c70793064f4bf15d19ee216\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.611001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.610949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/91f722200116ede2db7277922fd7931a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-183.ec2.internal\" (UID: \"91f722200116ede2db7277922fd7931a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.711537 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.711462 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-183.ec2.internal\" not found" Apr 16 22:13:41.711627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.711529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/91f722200116ede2db7277922fd7931a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-183.ec2.internal\" (UID: \"91f722200116ede2db7277922fd7931a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal" Apr 16 22:13:41.711627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.711557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/08cb6c990c70793064f4bf15d19ee216-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal\" (UID: \"08cb6c990c70793064f4bf15d19ee216\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:41.711627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.711574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08cb6c990c70793064f4bf15d19ee216-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal\" (UID: \"08cb6c990c70793064f4bf15d19ee216\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:41.711627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.711614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08cb6c990c70793064f4bf15d19ee216-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal\" (UID: \"08cb6c990c70793064f4bf15d19ee216\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:41.711751 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.711626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/91f722200116ede2db7277922fd7931a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-183.ec2.internal\" (UID: \"91f722200116ede2db7277922fd7931a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:41.711751 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.711675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/08cb6c990c70793064f4bf15d19ee216-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal\" (UID: \"08cb6c990c70793064f4bf15d19ee216\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:41.812231 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.812194 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-183.ec2.internal\" not found"
Apr 16 22:13:41.884458 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.884426 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:41.890054 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:41.890032 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:41.912924 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:41.912900 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-183.ec2.internal\" not found"
Apr 16 22:13:42.013476 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.013396 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-183.ec2.internal\" not found"
Apr 16 22:13:42.113752 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.113723 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-183.ec2.internal\" not found"
Apr 16 22:13:42.119776 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.119755 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:42.209183 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.209151 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:42.214659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.214627 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:13:42.214802 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.214780 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:42.214802 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.214783 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:42.214892 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.214792 2576 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a9066cf2dc2f44dec9f839e2f9f8b9ef-64e3d764390a030a.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.133.183:52230->44.219.145.83:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:42.214892 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.214816 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal"
Apr 16 22:13:42.214892 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.214796 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:42.230776 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.230752 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in
surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:42.289416 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.289388 2576 apiserver.go:52] "Watching apiserver"
Apr 16 22:13:42.301529 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.301504 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 22:13:42.302568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.302544 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-mfrqz","openshift-multus/multus-xxsfw","openshift-multus/network-metrics-daemon-6dklm","openshift-network-diagnostics/network-check-target-kd4tx","kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal","openshift-dns/node-resolver-58875","openshift-image-registry/node-ca-t27sn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal","openshift-multus/multus-additional-cni-plugins-p6fh4","openshift-network-operator/iptables-alerter-xbw9g","openshift-ovn-kubernetes/ovnkube-node-l8m6g","kube-system/konnectivity-agent-lsjm9","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn"]
Apr 16 22:13:42.305474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.305450 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t27sn"
Apr 16 22:13:42.306609 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.306587 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.307603 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.307583 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:42.307729 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.307706 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:13:42.308166 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:13:42.308290 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308270 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:13:42.308376 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308290 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:13:42.308376 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308298 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5bh96\""
Apr 16 22:13:42.308494 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308445 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:42.308722 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308704 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:13:42.308722 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308722 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:42.308864 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.308785 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:13:42.308864 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308800 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-58875"
Apr 16 22:13:42.308979 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.308885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:13:42.309219 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.309204 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7lvjq\""
Apr 16 22:13:42.309273 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.309245 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 22:13:42.309360 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.309270 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:13:42.309943 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.309920 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.310989 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.310966 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:41 +0000 UTC" deadline="2027-10-21 00:02:39.130108003 +0000 UTC"
Apr 16 22:13:42.311111 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.310989 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13249h48m56.819121669s"
Apr 16 22:13:42.311111 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.311031 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 22:13:42.311308 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.311291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p6fh4"
Apr 16 22:13:42.311380 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.311317 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 22:13:42.311453 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.311438 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k2vjm\""
Apr 16 22:13:42.311926 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.311910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6j4vn\""
Apr 16 22:13:42.312019 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.311997 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:42.312212 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.312198 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:42.312530 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.312514 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xbw9g"
Apr 16 22:13:42.314210 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314187 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:13:42.314338 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vpbk8\""
Apr 16 22:13:42.314338 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314309 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:13:42.314338 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a33450b-5146-4096-a0eb-79767266b790-hosts-file\") pod \"node-resolver-58875\" (UID: \"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875"
Apr 16 22:13:42.314509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysctl-conf\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.314509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314409 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 22:13:42.314509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-var-lib-kubelet\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.314509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/534a804f-de9c-430e-9e0a-47849b4977da-serviceca\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " pod="openshift-image-registry/node-ca-t27sn"
Apr 16 22:13:42.314509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwqg9\" (UniqueName: \"kubernetes.io/projected/534a804f-de9c-430e-9e0a-47849b4977da-kube-api-access-wwqg9\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " pod="openshift-image-registry/node-ca-t27sn"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53a1b81a-7c98-464a-b673-d8b7022f892d-cni-binary-copy\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvz5\" (UniqueName:
\"kubernetes.io/projected/53a1b81a-7c98-464a-b673-d8b7022f892d-kube-api-access-5dvz5\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-run\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-lib-modules\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-cni-multus\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-hostroot\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zrjd\" (UniqueName: \"kubernetes.io/projected/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-kube-api-access-4zrjd\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-os-release\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.314718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-conf-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a33450b-5146-4096-a0eb-79767266b790-tmp-dir\") pod \"node-resolver-58875\" (UID: \"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zb8p\" (UniqueName: \"kubernetes.io/projected/7a33450b-5146-4096-a0eb-79767266b790-kube-api-access-7zb8p\") pod \"node-resolver-58875\" (UID: \"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-systemd\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-sys\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-multus-certs\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-etc-kubernetes\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysconfig\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-tmp\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-cni-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-socket-dir-parent\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-netns\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.314998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/534a804f-de9c-430e-9e0a-47849b4977da-host\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " pod="openshift-image-registry/node-ca-t27sn"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-cni-bin\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-daemon-config\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-modprobe-d\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-kubernetes\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-tuned\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-system-cni-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315155 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jhztv\""
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315164 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-k8s-cni-cncf-io\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16
22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315219 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysctl-d\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-host\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsl97\" (UniqueName: \"kubernetes.io/projected/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-kube-api-access-tsl97\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-cnibin\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-kubelet\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw"
Apr 16 22:13:42.315818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.315446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:42.316308 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.316011 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lsjm9"
Apr 16 22:13:42.316846 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.316811 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 22:13:42.317400 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.317381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn"
Apr 16 22:13:42.317545 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.317525 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 22:13:42.317781 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.317767 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 22:13:42.317974 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.317955 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 22:13:42.318131 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.318113 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 22:13:42.318223 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.318181 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 22:13:42.318779 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.318660 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 22:13:42.318779 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.318775 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dkljr\""
Apr 16 22:13:42.318968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.318938 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-82tk7\""
Apr 16 22:13:42.319065 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.319046 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 22:13:42.319253 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.319235 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 22:13:42.320388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.319708 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 22:13:42.320388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.320161 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jgllw\""
Apr 16 22:13:42.320388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.320174 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 22:13:42.322841 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.322826 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:42.339037 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.339017 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-f5xtw"
Apr 16 22:13:42.346549 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.346529 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-f5xtw"
Apr 16 22:13:42.381031 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.381000 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08cb6c990c70793064f4bf15d19ee216.slice/crio-31750973e0a54208ce9a1901607f7ae61d9ccdb7b9306f8959b2c1db8c12e610 WatchSource:0}: Error finding container 31750973e0a54208ce9a1901607f7ae61d9ccdb7b9306f8959b2c1db8c12e610: Status 404 returned error can't find the container with id 31750973e0a54208ce9a1901607f7ae61d9ccdb7b9306f8959b2c1db8c12e610
Apr 16 22:13:42.381344 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.381312 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f722200116ede2db7277922fd7931a.slice/crio-19282c23abdb80e2dca39fa4517865bed31f3d8277a2ea6cedfd4fdad526266f WatchSource:0}: Error finding container 19282c23abdb80e2dca39fa4517865bed31f3d8277a2ea6cedfd4fdad526266f: Status 404 returned error can't find the container with id 19282c23abdb80e2dca39fa4517865bed31f3d8277a2ea6cedfd4fdad526266f
Apr 16 22:13:42.387111 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.387096 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:13:42.410021 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.410003 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:13:42.416155 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zrjd\" (UniqueName: \"kubernetes.io/projected/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-kube-api-access-4zrjd\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:42.416242 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4"
Apr 16 22:13:42.416242 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:13:42.416242 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovnkube-script-lib\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:13:42.416348 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4"
Apr 16 22:13:42.416386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-env-overrides\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:13:42.416386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovn-node-metrics-cert\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.416467 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.416467 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-socket-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.416467 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-etc-selinux\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.416566 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-os-release\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.416566 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:13:42.416499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-conf-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.416566 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:13:42.416674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-conf-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.416674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-os-release\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.416674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416577 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-system-cni-dir\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.416674 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:13:42.416609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a33450b-5146-4096-a0eb-79767266b790-tmp-dir\") pod \"node-resolver-58875\" (UID: \"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875" Apr 16 22:13:42.416674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-multus-certs\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.416877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-etc-kubernetes\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.416877 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.416739 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:42.416877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-multus-certs\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.416877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-etc-kubernetes\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.416877 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.416818 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:42.916785974 +0000 UTC m=+2.041410239 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:42.416877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cnibin\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.416877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-os-release\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-cni-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.416977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a33450b-5146-4096-a0eb-79767266b790-tmp-dir\") pod \"node-resolver-58875\" (UID: \"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-socket-dir-parent\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417029 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-netns\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-cni-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-systemd-units\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-netns\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-socket-dir-parent\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-run-netns\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.417175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-log-socket\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/534a804f-de9c-430e-9e0a-47849b4977da-host\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " pod="openshift-image-registry/node-ca-t27sn" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-cni-bin\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/534a804f-de9c-430e-9e0a-47849b4977da-host\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " pod="openshift-image-registry/node-ca-t27sn" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-daemon-config\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jg7n\" (UniqueName: \"kubernetes.io/projected/55cc55b1-436a-4a28-81cd-ec449dee73fb-kube-api-access-8jg7n\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-registration-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrzcb\" (UniqueName: \"kubernetes.io/projected/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-kube-api-access-xrzcb\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-kubernetes\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.417680 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-kubernetes\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417477 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4337d15c-2043-4ee1-a3d7-09710bf7d026-iptables-alerter-script\") pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-node-log\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417577 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-cni-netd\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-device-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: 
\"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.417680 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-host\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsl97\" (UniqueName: \"kubernetes.io/projected/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-kube-api-access-tsl97\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-cnibin\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-host\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-kubelet\") pod \"ovnkube-node-l8m6g\" (UID: 
\"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/887606c2-6d2b-4ac2-820a-f012e1cb7100-agent-certs\") pod \"konnectivity-agent-lsjm9\" (UID: \"887606c2-6d2b-4ac2-820a-f012e1cb7100\") " pod="kube-system/konnectivity-agent-lsjm9" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-cni-bin\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-cnibin\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a33450b-5146-4096-a0eb-79767266b790-hosts-file\") pod \"node-resolver-58875\" (UID: \"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/534a804f-de9c-430e-9e0a-47849b4977da-serviceca\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " 
pod="openshift-image-registry/node-ca-t27sn" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwqg9\" (UniqueName: \"kubernetes.io/projected/534a804f-de9c-430e-9e0a-47849b4977da-kube-api-access-wwqg9\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " pod="openshift-image-registry/node-ca-t27sn" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a33450b-5146-4096-a0eb-79767266b790-hosts-file\") pod \"node-resolver-58875\" (UID: \"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53a1b81a-7c98-464a-b673-d8b7022f892d-cni-binary-copy\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53a1b81a-7c98-464a-b673-d8b7022f892d-multus-daemon-config\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.417975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-var-lib-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-run\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-lib-modules\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-cni-multus\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-hostroot\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-run\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418964 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttq8g\" (UniqueName: \"kubernetes.io/projected/4337d15c-2043-4ee1-a3d7-09710bf7d026-kube-api-access-ttq8g\") pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-ovn\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-cni-multus\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-cni-bin\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-hostroot\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " 
pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418182 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-lib-modules\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-sys-fs\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/534a804f-de9c-430e-9e0a-47849b4977da-serviceca\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " pod="openshift-image-registry/node-ca-t27sn" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zb8p\" (UniqueName: \"kubernetes.io/projected/7a33450b-5146-4096-a0eb-79767266b790-kube-api-access-7zb8p\") pod \"node-resolver-58875\" (UID: 
\"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-systemd\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-sys\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysconfig\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-tmp\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53a1b81a-7c98-464a-b673-d8b7022f892d-cni-binary-copy\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " 
pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-systemd\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.418964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.418916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysconfig\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zgn8\" (UniqueName: \"kubernetes.io/projected/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-kube-api-access-4zgn8\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4337d15c-2043-4ee1-a3d7-09710bf7d026-host-slash\") pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-slash\") pod 
\"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-systemd\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-etc-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/887606c2-6d2b-4ac2-820a-f012e1cb7100-konnectivity-ca\") pod \"konnectivity-agent-lsjm9\" (UID: \"887606c2-6d2b-4ac2-820a-f012e1cb7100\") " pod="kube-system/konnectivity-agent-lsjm9" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-modprobe-d\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-tuned\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-system-cni-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-k8s-cni-cncf-io\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovnkube-config\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419542 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysctl-d\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-kubelet\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysctl-conf\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.419746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.419730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-var-lib-kubelet\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.420479 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:13:42.419756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvz5\" (UniqueName: \"kubernetes.io/projected/53a1b81a-7c98-464a-b673-d8b7022f892d-kube-api-access-5dvz5\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.420479 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.420136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-sys\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.420571 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.420497 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:13:42.421050 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.421027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-system-cni-dir\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.421278 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.421255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-var-lib-kubelet\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.421386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.421260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-var-lib-kubelet\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.421386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.421297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysctl-d\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.421386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.421057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53a1b81a-7c98-464a-b673-d8b7022f892d-host-run-k8s-cni-cncf-io\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.421507 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.421372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-modprobe-d\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.421507 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.421439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-sysctl-conf\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.423457 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.423432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-tmp\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.423534 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.423464 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-etc-tuned\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.424015 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.423997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zrjd\" (UniqueName: \"kubernetes.io/projected/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-kube-api-access-4zrjd\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:13:42.429593 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.429575 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:42.429688 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.429638 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:42.429688 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.429671 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f5c7k for pod openshift-network-diagnostics/network-check-target-kd4tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:42.429799 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.429727 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k podName:c5bcd735-429b-49bf-8436-33eb976d199a nodeName:}" failed. No retries permitted until 2026-04-16 22:13:42.929710264 +0000 UTC m=+2.054334540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-f5c7k" (UniqueName: "kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k") pod "network-check-target-kd4tx" (UID: "c5bcd735-429b-49bf-8436-33eb976d199a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:42.430242 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.430224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsl97\" (UniqueName: \"kubernetes.io/projected/6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74-kube-api-access-tsl97\") pod \"tuned-mfrqz\" (UID: \"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.431456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.431438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zb8p\" (UniqueName: \"kubernetes.io/projected/7a33450b-5146-4096-a0eb-79767266b790-kube-api-access-7zb8p\") pod \"node-resolver-58875\" (UID: \"7a33450b-5146-4096-a0eb-79767266b790\") " pod="openshift-dns/node-resolver-58875" Apr 16 22:13:42.431799 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.431778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvz5\" (UniqueName: \"kubernetes.io/projected/53a1b81a-7c98-464a-b673-d8b7022f892d-kube-api-access-5dvz5\") pod \"multus-xxsfw\" (UID: \"53a1b81a-7c98-464a-b673-d8b7022f892d\") " pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.432137 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:13:42.432123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwqg9\" (UniqueName: \"kubernetes.io/projected/534a804f-de9c-430e-9e0a-47849b4977da-kube-api-access-wwqg9\") pod \"node-ca-t27sn\" (UID: \"534a804f-de9c-430e-9e0a-47849b4977da\") " pod="openshift-image-registry/node-ca-t27sn" Apr 16 22:13:42.451534 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.451489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal" event={"ID":"91f722200116ede2db7277922fd7931a","Type":"ContainerStarted","Data":"19282c23abdb80e2dca39fa4517865bed31f3d8277a2ea6cedfd4fdad526266f"} Apr 16 22:13:42.452450 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.452431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal" event={"ID":"08cb6c990c70793064f4bf15d19ee216","Type":"ContainerStarted","Data":"31750973e0a54208ce9a1901607f7ae61d9ccdb7b9306f8959b2c1db8c12e610"} Apr 16 22:13:42.520243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.520243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.520243 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:13:42.520215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovnkube-script-lib\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.520243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.520243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-env-overrides\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520313 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovn-node-metrics-cert\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-socket-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-etc-selinux\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-etc-selinux\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.520583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-socket-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-system-cni-dir\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cnibin\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520644 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-os-release\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-system-cni-dir\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-systemd-units\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-os-release\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-run-netns\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-env-overrides\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cnibin\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: 
\"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-run-netns\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-log-socket\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jg7n\" (UniqueName: \"kubernetes.io/projected/55cc55b1-436a-4a28-81cd-ec449dee73fb-kube-api-access-8jg7n\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-systemd-units\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-log-socket\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-registration-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovnkube-script-lib\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.520972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-registration-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521009 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xrzcb\" (UniqueName: \"kubernetes.io/projected/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-kube-api-access-xrzcb\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4337d15c-2043-4ee1-a3d7-09710bf7d026-iptables-alerter-script\") pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-node-log\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-cni-netd\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-device-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.521699 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-cni-netd\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-kubelet\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-device-dir\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/887606c2-6d2b-4ac2-820a-f012e1cb7100-agent-certs\") pod \"konnectivity-agent-lsjm9\" (UID: \"887606c2-6d2b-4ac2-820a-f012e1cb7100\") " pod="kube-system/konnectivity-agent-lsjm9" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" 
Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-kubelet\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-node-log\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.521699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-var-lib-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttq8g\" (UniqueName: \"kubernetes.io/projected/4337d15c-2043-4ee1-a3d7-09710bf7d026-kube-api-access-ttq8g\") pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-ovn\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-cni-bin\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-var-lib-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-sys-fs\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4337d15c-2043-4ee1-a3d7-09710bf7d026-iptables-alerter-script\") pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-ovn\") pod \"ovnkube-node-l8m6g\" (UID: 
\"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-cni-bin\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zgn8\" (UniqueName: \"kubernetes.io/projected/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-kube-api-access-4zgn8\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4337d15c-2043-4ee1-a3d7-09710bf7d026-host-slash\") 
pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-sys-fs\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-slash\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4337d15c-2043-4ee1-a3d7-09710bf7d026-host-slash\") pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-systemd\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-etc-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-host-slash\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-etc-openvswitch\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55cc55b1-436a-4a28-81cd-ec449dee73fb-run-systemd\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/887606c2-6d2b-4ac2-820a-f012e1cb7100-konnectivity-ca\") pod \"konnectivity-agent-lsjm9\" (UID: \"887606c2-6d2b-4ac2-820a-f012e1cb7100\") " pod="kube-system/konnectivity-agent-lsjm9" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.521826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovnkube-config\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.522210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovnkube-config\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.522284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.522319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/887606c2-6d2b-4ac2-820a-f012e1cb7100-konnectivity-ca\") pod \"konnectivity-agent-lsjm9\" (UID: \"887606c2-6d2b-4ac2-820a-f012e1cb7100\") " pod="kube-system/konnectivity-agent-lsjm9" Apr 16 22:13:42.522944 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.522837 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55cc55b1-436a-4a28-81cd-ec449dee73fb-ovn-node-metrics-cert\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.523438 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.523423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/887606c2-6d2b-4ac2-820a-f012e1cb7100-agent-certs\") pod \"konnectivity-agent-lsjm9\" (UID: \"887606c2-6d2b-4ac2-820a-f012e1cb7100\") " pod="kube-system/konnectivity-agent-lsjm9" Apr 16 22:13:42.529431 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.529411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zgn8\" (UniqueName: \"kubernetes.io/projected/a0f43c4e-755a-43dd-96e1-ee4825dcce6e-kube-api-access-4zgn8\") pod \"multus-additional-cni-plugins-p6fh4\" (UID: \"a0f43c4e-755a-43dd-96e1-ee4825dcce6e\") " pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.529594 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.529579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttq8g\" (UniqueName: \"kubernetes.io/projected/4337d15c-2043-4ee1-a3d7-09710bf7d026-kube-api-access-ttq8g\") pod \"iptables-alerter-xbw9g\" (UID: \"4337d15c-2043-4ee1-a3d7-09710bf7d026\") " pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.529718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.529698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrzcb\" (UniqueName: \"kubernetes.io/projected/ec2d038b-93fc-4afc-81a0-2e23ea05d7f1-kube-api-access-xrzcb\") pod \"aws-ebs-csi-driver-node-v7jbn\" (UID: \"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 
16 22:13:42.529773 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.529758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jg7n\" (UniqueName: \"kubernetes.io/projected/55cc55b1-436a-4a28-81cd-ec449dee73fb-kube-api-access-8jg7n\") pod \"ovnkube-node-l8m6g\" (UID: \"55cc55b1-436a-4a28-81cd-ec449dee73fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.632983 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.632957 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t27sn" Apr 16 22:13:42.639086 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.639059 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod534a804f_de9c_430e_9e0a_47849b4977da.slice/crio-4f139fca8d1a493a8cfa00cc76df50629910157eaa97da20d5f8aa89c0338d46 WatchSource:0}: Error finding container 4f139fca8d1a493a8cfa00cc76df50629910157eaa97da20d5f8aa89c0338d46: Status 404 returned error can't find the container with id 4f139fca8d1a493a8cfa00cc76df50629910157eaa97da20d5f8aa89c0338d46 Apr 16 22:13:42.650024 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.650007 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xxsfw" Apr 16 22:13:42.655887 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.655865 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53a1b81a_7c98_464a_b673_d8b7022f892d.slice/crio-bafeee7cbce918f562b3c93cee8af0cb58a3235b5dc6a3901268b5416369c4cd WatchSource:0}: Error finding container bafeee7cbce918f562b3c93cee8af0cb58a3235b5dc6a3901268b5416369c4cd: Status 404 returned error can't find the container with id bafeee7cbce918f562b3c93cee8af0cb58a3235b5dc6a3901268b5416369c4cd Apr 16 22:13:42.656429 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.656401 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-58875" Apr 16 22:13:42.663879 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.663856 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a33450b_5146_4096_a0eb_79767266b790.slice/crio-13de637a9b817b0da1f8637fd3244411d812b4c925464923a36edff81744d512 WatchSource:0}: Error finding container 13de637a9b817b0da1f8637fd3244411d812b4c925464923a36edff81744d512: Status 404 returned error can't find the container with id 13de637a9b817b0da1f8637fd3244411d812b4c925464923a36edff81744d512 Apr 16 22:13:42.681873 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.681852 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" Apr 16 22:13:42.687224 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.687203 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcc3e13_0c5e_4d1f_bbf6_d609cae55a74.slice/crio-de862884ae9d6d6250b7ff5f970ff96a1fb793683a9d9f63bc998529be485f1d WatchSource:0}: Error finding container de862884ae9d6d6250b7ff5f970ff96a1fb793683a9d9f63bc998529be485f1d: Status 404 returned error can't find the container with id de862884ae9d6d6250b7ff5f970ff96a1fb793683a9d9f63bc998529be485f1d Apr 16 22:13:42.688218 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.688202 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" Apr 16 22:13:42.693637 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.693618 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f43c4e_755a_43dd_96e1_ee4825dcce6e.slice/crio-72c9316e640a2e2b9ad298d619b38fd9831472c8489444f35746dc5d2a2cb7a3 WatchSource:0}: Error finding container 72c9316e640a2e2b9ad298d619b38fd9831472c8489444f35746dc5d2a2cb7a3: Status 404 returned error can't find the container with id 72c9316e640a2e2b9ad298d619b38fd9831472c8489444f35746dc5d2a2cb7a3 Apr 16 22:13:42.694536 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.694512 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xbw9g" Apr 16 22:13:42.700637 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.700616 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:13:42.700953 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.700932 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4337d15c_2043_4ee1_a3d7_09710bf7d026.slice/crio-344b04a16ef7ca1ae8bfd20d15fa774287ec07612baea90db383b05ad38b095e WatchSource:0}: Error finding container 344b04a16ef7ca1ae8bfd20d15fa774287ec07612baea90db383b05ad38b095e: Status 404 returned error can't find the container with id 344b04a16ef7ca1ae8bfd20d15fa774287ec07612baea90db383b05ad38b095e Apr 16 22:13:42.706494 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.706476 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lsjm9" Apr 16 22:13:42.707475 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.707453 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55cc55b1_436a_4a28_81cd_ec449dee73fb.slice/crio-abefd24e48c4e2da637c939c217de7b70556a278863345b8c4652c83b2dd30f9 WatchSource:0}: Error finding container abefd24e48c4e2da637c939c217de7b70556a278863345b8c4652c83b2dd30f9: Status 404 returned error can't find the container with id abefd24e48c4e2da637c939c217de7b70556a278863345b8c4652c83b2dd30f9 Apr 16 22:13:42.710696 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.710676 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" Apr 16 22:13:42.714106 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.712987 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod887606c2_6d2b_4ac2_820a_f012e1cb7100.slice/crio-5bd9cc63373c0cc102390a0b7b5389f1da067b3246e76be0fda5b3c08970269c WatchSource:0}: Error finding container 5bd9cc63373c0cc102390a0b7b5389f1da067b3246e76be0fda5b3c08970269c: Status 404 returned error can't find the container with id 5bd9cc63373c0cc102390a0b7b5389f1da067b3246e76be0fda5b3c08970269c Apr 16 22:13:42.719031 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:13:42.719012 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2d038b_93fc_4afc_81a0_2e23ea05d7f1.slice/crio-b83d61166074948ee939e3206cbaf06c0e8c956cabccd85c9f29cdb67ff99a5a WatchSource:0}: Error finding container b83d61166074948ee939e3206cbaf06c0e8c956cabccd85c9f29cdb67ff99a5a: Status 404 returned error can't find the container with id b83d61166074948ee939e3206cbaf06c0e8c956cabccd85c9f29cdb67ff99a5a Apr 16 22:13:42.788781 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.788696 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:42.925060 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:42.925018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:13:42.925246 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.925167 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:42.925246 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:42.925232 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:43.92521354 +0000 UTC m=+3.049837812 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:43.026254 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.026118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:13:43.026254 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:43.026187 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:43.026254 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:43.026212 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:43.026254 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:43.026225 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f5c7k for pod openshift-network-diagnostics/network-check-target-kd4tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:43.026593 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:43.026294 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k podName:c5bcd735-429b-49bf-8436-33eb976d199a nodeName:}" failed. No retries permitted until 2026-04-16 22:13:44.026273432 +0000 UTC m=+3.150897705 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-f5c7k" (UniqueName: "kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k") pod "network-check-target-kd4tx" (UID: "c5bcd735-429b-49bf-8436-33eb976d199a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:43.348243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.348037 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:42 +0000 UTC" deadline="2027-12-09 12:32:51.016164076 +0000 UTC" Apr 16 22:13:43.348243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.348080 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14438h19m7.668088872s" Apr 16 22:13:43.462541 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.462351 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:43.470005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.469927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lsjm9" event={"ID":"887606c2-6d2b-4ac2-820a-f012e1cb7100","Type":"ContainerStarted","Data":"5bd9cc63373c0cc102390a0b7b5389f1da067b3246e76be0fda5b3c08970269c"} Apr 16 22:13:43.482621 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:13:43.482548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"abefd24e48c4e2da637c939c217de7b70556a278863345b8c4652c83b2dd30f9"}
Apr 16 22:13:43.504543 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.504511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xbw9g" event={"ID":"4337d15c-2043-4ee1-a3d7-09710bf7d026","Type":"ContainerStarted","Data":"344b04a16ef7ca1ae8bfd20d15fa774287ec07612baea90db383b05ad38b095e"}
Apr 16 22:13:43.519263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.519231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-58875" event={"ID":"7a33450b-5146-4096-a0eb-79767266b790","Type":"ContainerStarted","Data":"13de637a9b817b0da1f8637fd3244411d812b4c925464923a36edff81744d512"}
Apr 16 22:13:43.558119 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.556662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxsfw" event={"ID":"53a1b81a-7c98-464a-b673-d8b7022f892d","Type":"ContainerStarted","Data":"bafeee7cbce918f562b3c93cee8af0cb58a3235b5dc6a3901268b5416369c4cd"}
Apr 16 22:13:43.566470 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.566439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t27sn" event={"ID":"534a804f-de9c-430e-9e0a-47849b4977da","Type":"ContainerStarted","Data":"4f139fca8d1a493a8cfa00cc76df50629910157eaa97da20d5f8aa89c0338d46"}
Apr 16 22:13:43.581123 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.581053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" event={"ID":"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1","Type":"ContainerStarted","Data":"b83d61166074948ee939e3206cbaf06c0e8c956cabccd85c9f29cdb67ff99a5a"}
Apr 16 22:13:43.596576 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.596542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerStarted","Data":"72c9316e640a2e2b9ad298d619b38fd9831472c8489444f35746dc5d2a2cb7a3"}
Apr 16 22:13:43.621127 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.621055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" event={"ID":"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74","Type":"ContainerStarted","Data":"de862884ae9d6d6250b7ff5f970ff96a1fb793683a9d9f63bc998529be485f1d"}
Apr 16 22:13:43.800737 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.800704 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:43.934319 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:43.933686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:43.934319 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:43.933847 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:43.934319 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:43.933911 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.933892039 +0000 UTC m=+5.058516316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:44.035628 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.035017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:44.035628 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.035186 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:44.035628 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.035204 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:44.035628 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.035215 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f5c7k for pod openshift-network-diagnostics/network-check-target-kd4tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:44.035628 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.035269 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k podName:c5bcd735-429b-49bf-8436-33eb976d199a nodeName:}" failed. No retries permitted until 2026-04-16 22:13:46.035251527 +0000 UTC m=+5.159875807 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f5c7k" (UniqueName: "kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k") pod "network-check-target-kd4tx" (UID: "c5bcd735-429b-49bf-8436-33eb976d199a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:44.348534 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.348495 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:42 +0000 UTC" deadline="2027-12-17 17:58:47.441624216 +0000 UTC"
Apr 16 22:13:44.348534 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.348531 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14635h45m3.093096406s"
Apr 16 22:13:44.449098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.449066 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:44.449269 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.449169 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:13:44.449444 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.449428 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:44.449536 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.449520 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:13:44.467666 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.467641 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:44.560667 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.560623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kzvxc"]
Apr 16 22:13:44.562967 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.562942 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:44.563086 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.563026 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:13:44.639497 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.639415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c77a6141-b229-4002-8fb2-722d0a6093bb-kubelet-config\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:44.639497 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.639488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c77a6141-b229-4002-8fb2-722d0a6093bb-dbus\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:44.639731 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.639520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:44.741400 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.740561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c77a6141-b229-4002-8fb2-722d0a6093bb-kubelet-config\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:44.741400 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.740646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c77a6141-b229-4002-8fb2-722d0a6093bb-dbus\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:44.741400 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.740681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:44.741400 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.740837 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:44.741400 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:44.740897 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret podName:c77a6141-b229-4002-8fb2-722d0a6093bb nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.240878453 +0000 UTC m=+4.365502723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret") pod "global-pull-secret-syncer-kzvxc" (UID: "c77a6141-b229-4002-8fb2-722d0a6093bb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:44.741400 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.741174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c77a6141-b229-4002-8fb2-722d0a6093bb-kubelet-config\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:44.741400 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:44.741340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c77a6141-b229-4002-8fb2-722d0a6093bb-dbus\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:45.245520 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:45.245488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:45.245709 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:45.245636 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:45.245769 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:45.245729 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret podName:c77a6141-b229-4002-8fb2-722d0a6093bb nodeName:}" failed. No retries permitted until 2026-04-16 22:13:46.245674404 +0000 UTC m=+5.370298669 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret") pod "global-pull-secret-syncer-kzvxc" (UID: "c77a6141-b229-4002-8fb2-722d0a6093bb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:45.950834 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:45.950165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:45.950834 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:45.950361 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:45.950834 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:45.950429 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.950408064 +0000 UTC m=+9.075032336 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:46.051135 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:46.051097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:46.051319 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.051287 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:46.051319 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.051309 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:46.051319 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.051335 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f5c7k for pod openshift-network-diagnostics/network-check-target-kd4tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:46.051511 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.051394 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k podName:c5bcd735-429b-49bf-8436-33eb976d199a nodeName:}" failed. No retries permitted until 2026-04-16 22:13:50.051377073 +0000 UTC m=+9.176001361 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-f5c7k" (UniqueName: "kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k") pod "network-check-target-kd4tx" (UID: "c5bcd735-429b-49bf-8436-33eb976d199a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:46.252840 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:46.252705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:46.253005 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.252877 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:46.253005 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.252940 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret podName:c77a6141-b229-4002-8fb2-722d0a6093bb nodeName:}" failed. No retries permitted until 2026-04-16 22:13:48.252922382 +0000 UTC m=+7.377546661 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret") pod "global-pull-secret-syncer-kzvxc" (UID: "c77a6141-b229-4002-8fb2-722d0a6093bb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:46.449079 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:46.449044 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:46.449258 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.449177 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:13:46.449258 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:46.449251 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:46.449412 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.449390 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:13:46.449412 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:46.449393 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:46.449523 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:46.449484 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:13:48.272907 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:48.272493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:48.272907 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:48.272649 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:48.272907 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:48.272837 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret podName:c77a6141-b229-4002-8fb2-722d0a6093bb nodeName:}" failed. No retries permitted until 2026-04-16 22:13:52.272814937 +0000 UTC m=+11.397439217 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret") pod "global-pull-secret-syncer-kzvxc" (UID: "c77a6141-b229-4002-8fb2-722d0a6093bb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:48.448565 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:48.448531 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:48.448758 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:48.448532 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:48.448758 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:48.448670 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:13:48.448866 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:48.448804 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:13:48.448866 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:48.448544 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:48.448972 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:48.448891 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:13:49.987069 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:49.986960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:49.987531 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:49.987111 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:49.987531 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:49.987181 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:57.987164014 +0000 UTC m=+17.111788284 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:50.088022 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:50.087925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:50.088200 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:50.088130 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:50.088200 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:50.088155 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:50.088200 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:50.088166 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f5c7k for pod openshift-network-diagnostics/network-check-target-kd4tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:50.088423 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:50.088221 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k podName:c5bcd735-429b-49bf-8436-33eb976d199a nodeName:}" failed. No retries permitted until 2026-04-16 22:13:58.088201897 +0000 UTC m=+17.212826166 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-f5c7k" (UniqueName: "kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k") pod "network-check-target-kd4tx" (UID: "c5bcd735-429b-49bf-8436-33eb976d199a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:50.448378 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:50.448313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:50.448558 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:50.448480 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:13:50.448558 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:50.448313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:50.448692 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:50.448626 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:13:50.448692 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:50.448313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:50.448795 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:50.448702 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:13:52.305939 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:52.305902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:52.306438 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:52.306060 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:52.306438 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:52.306130 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret podName:c77a6141-b229-4002-8fb2-722d0a6093bb nodeName:}" failed. No retries permitted until 2026-04-16 22:14:00.306114643 +0000 UTC m=+19.430738917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret") pod "global-pull-secret-syncer-kzvxc" (UID: "c77a6141-b229-4002-8fb2-722d0a6093bb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:52.449141 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:52.449103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:52.449141 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:52.449124 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:52.449357 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:52.449104 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:52.449357 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:52.449219 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:13:52.449357 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:52.449314 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:13:52.449502 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:52.449409 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:13:54.449061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:54.449029 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:54.449061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:54.449061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:54.449516 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:54.449069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:54.449516 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:54.449146 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:13:54.449516 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:54.449308 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:13:54.449516 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:54.449442 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:13:56.449249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:56.449212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:56.449670 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:56.449212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:13:56.449670 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:56.449366 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:13:56.449670 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:56.449212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:13:56.449670 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:56.449443 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:13:56.449670 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:56.449506 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:13:58.049336 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:58.049293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:13:58.049770 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.049461 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:58.049770 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.049539 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:14.049517869 +0000 UTC m=+33.174142142 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:58.149968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:58.149930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:13:58.150133 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.150109 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:58.150207 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.150136 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:58.150207 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.150148 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f5c7k for pod openshift-network-diagnostics/network-check-target-kd4tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:58.150309 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.150210 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k podName:c5bcd735-429b-49bf-8436-33eb976d199a nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:14.150190927 +0000 UTC m=+33.274815215 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-f5c7k" (UniqueName: "kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k") pod "network-check-target-kd4tx" (UID: "c5bcd735-429b-49bf-8436-33eb976d199a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:58.448920 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:58.448891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:13:58.449083 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:58.448891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:13:58.449083 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:13:58.448891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc" Apr 16 22:13:58.449083 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.449064 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a" Apr 16 22:13:58.449221 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.449154 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05" Apr 16 22:13:58.449279 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:13:58.449249 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb" Apr 16 22:14:00.366948 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:00.366911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc" Apr 16 22:14:00.367415 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:00.367092 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:14:00.367415 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:00.367173 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret podName:c77a6141-b229-4002-8fb2-722d0a6093bb nodeName:}" failed. No retries permitted until 2026-04-16 22:14:16.367153008 +0000 UTC m=+35.491777281 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret") pod "global-pull-secret-syncer-kzvxc" (UID: "c77a6141-b229-4002-8fb2-722d0a6093bb") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:14:00.449138 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:00.449031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:14:00.449317 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:00.449031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:14:00.449317 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:00.449153 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05" Apr 16 22:14:00.449317 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:00.449211 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc" Apr 16 22:14:00.449317 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:00.449296 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a" Apr 16 22:14:00.449552 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:00.449383 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb" Apr 16 22:14:01.669242 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.669009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"7a57ba6a51d9bf8eee44abce3cf5c4797b143939b0872ee3eaf09e4338e2a280"} Apr 16 22:14:01.670044 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.669260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"d8d60756e346e009cea3b7e00618899d5ba6d47549d9d0f93d10fd1602c01153"} Apr 16 22:14:01.670044 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.669279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"434a00d945d9143782ccf79a472f605d4e04996780f7662444075e52d9bfcdd0"} Apr 16 22:14:01.670044 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.669294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"ab4f082e710d0ac7b424357cc8042bde5638f354bb34ea4680a816d082df13cb"} Apr 16 22:14:01.679240 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:14:01.679210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxsfw" event={"ID":"53a1b81a-7c98-464a-b673-d8b7022f892d","Type":"ContainerStarted","Data":"ecd814bb4ae220bde28867ae2ca100014d89cd8fe243071147a669f82410922f"} Apr 16 22:14:01.682814 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.682263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal" event={"ID":"91f722200116ede2db7277922fd7931a","Type":"ContainerStarted","Data":"c1313780f5734ff54ba342925e55031ad9233408f8f4f64fee2181e5251e4eae"} Apr 16 22:14:01.685068 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.685045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" event={"ID":"6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74","Type":"ContainerStarted","Data":"f970e3b71d5a2e73590477b35f640053caa660b07405b085a65a5b338d9313f4"} Apr 16 22:14:01.696540 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.696494 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xxsfw" podStartSLOduration=2.742524888 podStartE2EDuration="20.696478779s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.65740928 +0000 UTC m=+1.782033546" lastFinishedPulling="2026-04-16 22:14:00.611363169 +0000 UTC m=+19.735987437" observedRunningTime="2026-04-16 22:14:01.69616389 +0000 UTC m=+20.820788178" watchObservedRunningTime="2026-04-16 22:14:01.696478779 +0000 UTC m=+20.821103067" Apr 16 22:14:01.717247 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.717174 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-183.ec2.internal" podStartSLOduration=19.717155964 podStartE2EDuration="19.717155964s" podCreationTimestamp="2026-04-16 22:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:01.716495597 +0000 UTC m=+20.841119885" watchObservedRunningTime="2026-04-16 22:14:01.717155964 +0000 UTC m=+20.841780267" Apr 16 22:14:01.743403 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:01.741383 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mfrqz" podStartSLOduration=2.84051609 podStartE2EDuration="20.7413635s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.688580983 +0000 UTC m=+1.813205247" lastFinishedPulling="2026-04-16 22:14:00.589428393 +0000 UTC m=+19.714052657" observedRunningTime="2026-04-16 22:14:01.740288967 +0000 UTC m=+20.864913257" watchObservedRunningTime="2026-04-16 22:14:01.7413635 +0000 UTC m=+20.865987787" Apr 16 22:14:02.448273 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.448244 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:14:02.448430 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.448245 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc" Apr 16 22:14:02.448430 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:02.448365 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a" Apr 16 22:14:02.448430 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.448255 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:14:02.448539 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:02.448433 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb" Apr 16 22:14:02.448539 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:02.448493 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05" Apr 16 22:14:02.688693 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.688658 2576 generic.go:358] "Generic (PLEG): container finished" podID="a0f43c4e-755a-43dd-96e1-ee4825dcce6e" containerID="ba35c7ecba9fd60fa9be03285ad381087c6cb3c7b31d364ac0a074712b660230" exitCode=0 Apr 16 22:14:02.689119 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.688749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerDied","Data":"ba35c7ecba9fd60fa9be03285ad381087c6cb3c7b31d364ac0a074712b660230"} Apr 16 22:14:02.690810 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.690778 2576 generic.go:358] "Generic (PLEG): container finished" podID="08cb6c990c70793064f4bf15d19ee216" containerID="cff245a2724acd6496e202e57b4de39ac53238c2a30dc72632d749cbceb3be5e" exitCode=0 Apr 16 22:14:02.691112 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:14:02.691072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal" event={"ID":"08cb6c990c70793064f4bf15d19ee216","Type":"ContainerDied","Data":"cff245a2724acd6496e202e57b4de39ac53238c2a30dc72632d749cbceb3be5e"} Apr 16 22:14:02.692546 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.692516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lsjm9" event={"ID":"887606c2-6d2b-4ac2-820a-f012e1cb7100","Type":"ContainerStarted","Data":"a7b9fcaf423cffa1efc6facf7b11fa288b3c6163a05d2d480d4ecce1f3764dc5"} Apr 16 22:14:02.698446 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.698389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"0b3c244eb264d50b3a104b2375d30848e8519eba1cf5ba8f9ff1fed0220837f9"} Apr 16 22:14:02.698446 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.698421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"ce992cb0bc9c7364440fd944b64c4548f0282043bfc1528015edd8c845f5e824"} Apr 16 22:14:02.699863 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.699829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xbw9g" event={"ID":"4337d15c-2043-4ee1-a3d7-09710bf7d026","Type":"ContainerStarted","Data":"71db5b16c6ba61edc1049bff90984013a03bc9dacdc2f1f9cf3de79c2943cf40"} Apr 16 22:14:02.701266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.701241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-58875" event={"ID":"7a33450b-5146-4096-a0eb-79767266b790","Type":"ContainerStarted","Data":"8932cc707d2fdbd917bd668d1ca5d93db0f63a75f61af2d514afb715ede7a441"} Apr 16 
22:14:02.702612 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.702587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t27sn" event={"ID":"534a804f-de9c-430e-9e0a-47849b4977da","Type":"ContainerStarted","Data":"63efa41eea0cc29c84a07b18bb87d022f40fe6ece38bc25b747929da1cea023a"} Apr 16 22:14:02.703998 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.703975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" event={"ID":"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1","Type":"ContainerStarted","Data":"27fda951eddae01c35ea1808b68c954b4378132b841cf63dbf40d6644879ddd9"} Apr 16 22:14:02.732808 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.732768 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-58875" podStartSLOduration=3.802896433 podStartE2EDuration="21.732755986s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.664963194 +0000 UTC m=+1.789587459" lastFinishedPulling="2026-04-16 22:14:00.594822733 +0000 UTC m=+19.719447012" observedRunningTime="2026-04-16 22:14:02.73262805 +0000 UTC m=+21.857252336" watchObservedRunningTime="2026-04-16 22:14:02.732755986 +0000 UTC m=+21.857380272" Apr 16 22:14:02.748776 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.748728 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xbw9g" podStartSLOduration=3.864357714 podStartE2EDuration="21.748712431s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.70353412 +0000 UTC m=+1.828158386" lastFinishedPulling="2026-04-16 22:14:00.587888818 +0000 UTC m=+19.712513103" observedRunningTime="2026-04-16 22:14:02.748235357 +0000 UTC m=+21.872859649" watchObservedRunningTime="2026-04-16 22:14:02.748712431 +0000 UTC m=+21.873336717" Apr 16 22:14:02.766074 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:14:02.766022 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lsjm9" podStartSLOduration=3.894458715 podStartE2EDuration="21.76600437s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.716071837 +0000 UTC m=+1.840696103" lastFinishedPulling="2026-04-16 22:14:00.587617489 +0000 UTC m=+19.712241758" observedRunningTime="2026-04-16 22:14:02.764552077 +0000 UTC m=+21.889176365" watchObservedRunningTime="2026-04-16 22:14:02.76600437 +0000 UTC m=+21.890628658" Apr 16 22:14:02.798546 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.798511 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-t27sn" podStartSLOduration=8.196459952 podStartE2EDuration="21.798497821s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.641059256 +0000 UTC m=+1.765683525" lastFinishedPulling="2026-04-16 22:13:56.243097114 +0000 UTC m=+15.367721394" observedRunningTime="2026-04-16 22:14:02.798278627 +0000 UTC m=+21.922902905" watchObservedRunningTime="2026-04-16 22:14:02.798497821 +0000 UTC m=+21.923122156" Apr 16 22:14:02.962832 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:02.962800 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:14:03.394577 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:03.394437 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:14:02.962817146Z","UUID":"be7521fb-257d-4acc-97ab-76e6a55960f7","Handler":null,"Name":"","Endpoint":""} Apr 16 22:14:03.397047 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:03.397020 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver 
with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 22:14:03.397184 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:03.397055 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 22:14:03.707970 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:03.707935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" event={"ID":"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1","Type":"ContainerStarted","Data":"4587bd14d635d6dc9f38111292ce0d76a82d89fe0bc760051e6a0c3d1cf9a364"} Apr 16 22:14:03.710592 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:03.710496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal" event={"ID":"08cb6c990c70793064f4bf15d19ee216","Type":"ContainerStarted","Data":"97793a1ed696b4536d524dbc82559997a84501ffe35fa908c0fc2cef5ae24d79"} Apr 16 22:14:03.727054 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:03.727002 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-183.ec2.internal" podStartSLOduration=21.72698427 podStartE2EDuration="21.72698427s" podCreationTimestamp="2026-04-16 22:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:03.726468564 +0000 UTC m=+22.851092853" watchObservedRunningTime="2026-04-16 22:14:03.72698427 +0000 UTC m=+22.851608560" Apr 16 22:14:04.448624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:04.448592 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:14:04.448624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:04.448612 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc" Apr 16 22:14:04.448832 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:04.448592 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:14:04.448832 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:04.448734 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05" Apr 16 22:14:04.448832 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:04.448770 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a" Apr 16 22:14:04.448994 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:04.448865 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb" Apr 16 22:14:04.715967 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:04.715893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"239b9a9eaa0cc480efeaecf5fea92dcde01b441d724170fffee4da7bdbc01031"} Apr 16 22:14:04.717931 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:04.717889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" event={"ID":"ec2d038b-93fc-4afc-81a0-2e23ea05d7f1","Type":"ContainerStarted","Data":"d96a1bee53d1dfd9fdaa71f284be95d3fd40889db263388347f089524514b257"} Apr 16 22:14:04.735836 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:04.735797 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v7jbn" podStartSLOduration=2.47787964 podStartE2EDuration="23.735783919s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.720480637 +0000 UTC m=+1.845104916" lastFinishedPulling="2026-04-16 22:14:03.978384915 +0000 UTC m=+23.103009195" observedRunningTime="2026-04-16 22:14:04.735122355 +0000 UTC m=+23.859746675" watchObservedRunningTime="2026-04-16 22:14:04.735783919 +0000 UTC m=+23.860408238" Apr 16 22:14:06.448851 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.448811 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:14:06.449297 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.448816 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:14:06.449297 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:06.448944 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:14:06.449297 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:06.449050 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:14:06.449297 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.448829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:06.449297 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:06.449171 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:14:06.681648 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.681469 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lsjm9"
Apr 16 22:14:06.682399 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.682355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lsjm9"
Apr 16 22:14:06.725239 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.725140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" event={"ID":"55cc55b1-436a-4a28-81cd-ec449dee73fb","Type":"ContainerStarted","Data":"4f10d0795741f4cf4444bb0c6501434fac10bc330273be473043bae580fd4955"}
Apr 16 22:14:06.725563 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.725541 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:14:06.725654 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.725582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:14:06.728102 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.728050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerStarted","Data":"83b3ea06e2ffa7b0d8d2f8e085bc6262002a9b6a966cbc8817ad0d804d95b3f7"}
Apr 16 22:14:06.728346 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.728310 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lsjm9"
Apr 16 22:14:06.728896 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.728877 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lsjm9"
Apr 16 22:14:06.741872 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.741849 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:14:06.757063 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:06.757024 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" podStartSLOduration=7.089054887 podStartE2EDuration="25.7570109s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.709279402 +0000 UTC m=+1.833903670" lastFinishedPulling="2026-04-16 22:14:01.377235415 +0000 UTC m=+20.501859683" observedRunningTime="2026-04-16 22:14:06.756958549 +0000 UTC m=+25.881582848" watchObservedRunningTime="2026-04-16 22:14:06.7570109 +0000 UTC m=+25.881635186"
Apr 16 22:14:07.732167 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:07.732128 2576 generic.go:358] "Generic (PLEG): container finished" podID="a0f43c4e-755a-43dd-96e1-ee4825dcce6e" containerID="83b3ea06e2ffa7b0d8d2f8e085bc6262002a9b6a966cbc8817ad0d804d95b3f7" exitCode=0
Apr 16 22:14:07.733153 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:07.732223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerDied","Data":"83b3ea06e2ffa7b0d8d2f8e085bc6262002a9b6a966cbc8817ad0d804d95b3f7"}
Apr 16 22:14:07.733153 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:07.732683 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:14:07.751720 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:07.751697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g"
Apr 16 22:14:08.448911 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.448761 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:08.448988 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.448761 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:14:08.449020 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:08.448989 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:14:08.449156 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:08.449067 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:14:08.449156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.448761 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:14:08.449156 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:08.449143 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:14:08.510596 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.510563 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6dklm"]
Apr 16 22:14:08.511141 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.511115 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kd4tx"]
Apr 16 22:14:08.513071 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.513050 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kzvxc"]
Apr 16 22:14:08.736178 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.736145 2576 generic.go:358] "Generic (PLEG): container finished" podID="a0f43c4e-755a-43dd-96e1-ee4825dcce6e" containerID="0b48ca2d6d834c904724453221f6d2c22b46de7cfc1d0051245ab207bfeefe53" exitCode=0
Apr 16 22:14:08.736602 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.736251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerDied","Data":"0b48ca2d6d834c904724453221f6d2c22b46de7cfc1d0051245ab207bfeefe53"}
Apr 16 22:14:08.736602 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.736284 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:14:08.737346 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:08.736794 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:14:08.737346 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.736822 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:14:08.737346 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:08.736906 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:14:08.737346 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:08.737108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:08.737346 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:08.737200 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:14:09.742897 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:09.742863 2576 generic.go:358] "Generic (PLEG): container finished" podID="a0f43c4e-755a-43dd-96e1-ee4825dcce6e" containerID="5b42e85cf7eaf60605430a623d5721aefaa42d77ca9df5a035ab98d9c8815b15" exitCode=0
Apr 16 22:14:09.743281 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:09.742950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerDied","Data":"5b42e85cf7eaf60605430a623d5721aefaa42d77ca9df5a035ab98d9c8815b15"}
Apr 16 22:14:10.448748 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:10.448693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:14:10.448748 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:10.448731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:10.448950 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:10.448731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:14:10.448950 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:10.448841 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:14:10.449032 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:10.448944 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:14:10.449062 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:10.449032 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:14:12.448549 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:12.448507 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:14:12.448549 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:12.448531 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:12.448964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:12.448505 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:14:12.448964 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:12.448642 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05"
Apr 16 22:14:12.448964 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:12.448752 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd4tx" podUID="c5bcd735-429b-49bf-8436-33eb976d199a"
Apr 16 22:14:12.448964 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:12.448859 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kzvxc" podUID="c77a6141-b229-4002-8fb2-722d0a6093bb"
Apr 16 22:14:14.078193 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.078101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:14:14.078659 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.078264 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:14.078659 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.078359 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:46.078337676 +0000 UTC m=+65.202961962 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:14.178999 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.178959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:14:14.179183 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.179146 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:14:14.179183 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.179169 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:14:14.179183 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.179182 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f5c7k for pod openshift-network-diagnostics/network-check-target-kd4tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:14.179357 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.179246 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k podName:c5bcd735-429b-49bf-8436-33eb976d199a nodeName:}" failed. No retries permitted until 2026-04-16 22:14:46.17922718 +0000 UTC m=+65.303851448 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-f5c7k" (UniqueName: "kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k") pod "network-check-target-kd4tx" (UID: "c5bcd735-429b-49bf-8436-33eb976d199a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:14.249779 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.249747 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-183.ec2.internal" event="NodeReady"
Apr 16 22:14:14.249960 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.249898 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:14:14.293795 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.293761 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-898dq"]
Apr 16 22:14:14.317887 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.317852 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sbt4r"]
Apr 16 22:14:14.317887 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.317888 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-898dq"
Apr 16 22:14:14.320428 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.320384 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:14.320589 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.320445 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:14.320589 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.320482 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bd46\""
Apr 16 22:14:14.320589 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.320498 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:14.338734 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.338671 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-898dq"]
Apr 16 22:14:14.338734 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.338710 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sbt4r"]
Apr 16 22:14:14.338734 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.338730 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.341160 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.341122 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:14.341160 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.341137 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:14.341347 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.341127 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6fpb6\""
Apr 16 22:14:14.448702 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.448501 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:14.448702 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.448634 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx"
Apr 16 22:14:14.448951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.448885 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm"
Apr 16 22:14:14.451718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.451570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:14:14.451718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.451594 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 22:14:14.451718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.451646 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:14.451951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.451734 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lt98\""
Apr 16 22:14:14.451951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.451819 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gj5qm\""
Apr 16 22:14:14.451951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.451895 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:14.481041 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.481015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ffb347-b866-40c3-b690-eb1a564849ef-config-volume\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.481161 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.481140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vc69\" (UniqueName: \"kubernetes.io/projected/c2ffb347-b866-40c3-b690-eb1a564849ef-kube-api-access-9vc69\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.481228 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.481177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq"
Apr 16 22:14:14.481228 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.481211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2ffb347-b866-40c3-b690-eb1a564849ef-tmp-dir\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.481317 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.481227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmkx\" (UniqueName: \"kubernetes.io/projected/3fede498-aa63-44a3-8ef2-e602f7ca7131-kube-api-access-2kmkx\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq"
Apr 16 22:14:14.481317 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.481256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.582425 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.582387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.582620 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.582449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ffb347-b866-40c3-b690-eb1a564849ef-config-volume\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.582620 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.582536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vc69\" (UniqueName: \"kubernetes.io/projected/c2ffb347-b866-40c3-b690-eb1a564849ef-kube-api-access-9vc69\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.582620 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.582564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq"
Apr 16 22:14:14.582620 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.582581 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:14.582620 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.582599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2ffb347-b866-40c3-b690-eb1a564849ef-tmp-dir\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.582620 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.582620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmkx\" (UniqueName: \"kubernetes.io/projected/3fede498-aa63-44a3-8ef2-e602f7ca7131-kube-api-access-2kmkx\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq"
Apr 16 22:14:14.582899 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.582671 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls podName:c2ffb347-b866-40c3-b690-eb1a564849ef nodeName:}" failed. No retries permitted until 2026-04-16 22:14:15.08265117 +0000 UTC m=+34.207275455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls") pod "dns-default-sbt4r" (UID: "c2ffb347-b866-40c3-b690-eb1a564849ef") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:14.582899 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.582736 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:14.582899 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:14.582798 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert podName:3fede498-aa63-44a3-8ef2-e602f7ca7131 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:15.082782136 +0000 UTC m=+34.207406414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert") pod "ingress-canary-898dq" (UID: "3fede498-aa63-44a3-8ef2-e602f7ca7131") : secret "canary-serving-cert" not found
Apr 16 22:14:14.583123 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.583098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2ffb347-b866-40c3-b690-eb1a564849ef-tmp-dir\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.583264 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.583246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ffb347-b866-40c3-b690-eb1a564849ef-config-volume\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.593693 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.593636 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vc69\" (UniqueName: \"kubernetes.io/projected/c2ffb347-b866-40c3-b690-eb1a564849ef-kube-api-access-9vc69\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:14.593862 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:14.593842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kmkx\" (UniqueName: \"kubernetes.io/projected/3fede498-aa63-44a3-8ef2-e602f7ca7131-kube-api-access-2kmkx\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq"
Apr 16 22:14:15.086835 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:15.086798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:15.087393 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:15.086974 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:15.087393 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:15.087002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq"
Apr 16 22:14:15.087393 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:15.087063 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls podName:c2ffb347-b866-40c3-b690-eb1a564849ef nodeName:}" failed. No retries permitted until 2026-04-16 22:14:16.087042552 +0000 UTC m=+35.211666830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls") pod "dns-default-sbt4r" (UID: "c2ffb347-b866-40c3-b690-eb1a564849ef") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:15.087393 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:15.087096 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:15.087393 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:15.087147 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert podName:3fede498-aa63-44a3-8ef2-e602f7ca7131 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:16.087135084 +0000 UTC m=+35.211759356 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert") pod "ingress-canary-898dq" (UID: "3fede498-aa63-44a3-8ef2-e602f7ca7131") : secret "canary-serving-cert" not found
Apr 16 22:14:16.094976 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.094938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:14:16.095589 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.095045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq"
Apr 16 22:14:16.095589 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:16.095105 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:16.095589 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:16.095158 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:16.095589 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:16.095180 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls podName:c2ffb347-b866-40c3-b690-eb1a564849ef nodeName:}" failed. No retries permitted until 2026-04-16 22:14:18.095161189 +0000 UTC m=+37.219785455 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls") pod "dns-default-sbt4r" (UID: "c2ffb347-b866-40c3-b690-eb1a564849ef") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:16.095589 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:16.095215 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert podName:3fede498-aa63-44a3-8ef2-e602f7ca7131 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:18.095200006 +0000 UTC m=+37.219824286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert") pod "ingress-canary-898dq" (UID: "3fede498-aa63-44a3-8ef2-e602f7ca7131") : secret "canary-serving-cert" not found
Apr 16 22:14:16.398241 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.398210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:16.411344 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.411293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c77a6141-b229-4002-8fb2-722d0a6093bb-original-pull-secret\") pod \"global-pull-secret-syncer-kzvxc\" (UID: \"c77a6141-b229-4002-8fb2-722d0a6093bb\") " pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:16.559919 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.559887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kzvxc"
Apr 16 22:14:16.728960 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.728923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kzvxc"]
Apr 16 22:14:16.733122 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:14:16.733081 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc77a6141_b229_4002_8fb2_722d0a6093bb.slice/crio-631299813f7b865b564c38c2553cdba92bcda5e3bcc352cd3e35cc47769e851e WatchSource:0}: Error finding container 631299813f7b865b564c38c2553cdba92bcda5e3bcc352cd3e35cc47769e851e: Status 404 returned error can't find the container with id 631299813f7b865b564c38c2553cdba92bcda5e3bcc352cd3e35cc47769e851e
Apr 16 22:14:16.757487 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.757309 2576 generic.go:358] "Generic (PLEG): container finished" podID="a0f43c4e-755a-43dd-96e1-ee4825dcce6e" containerID="882d4c846fd2efb92a68f97b23f5849d9ce293831aa833aca35d00b3abd21018" exitCode=0
Apr 16 22:14:16.757630 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.757357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerDied","Data":"882d4c846fd2efb92a68f97b23f5849d9ce293831aa833aca35d00b3abd21018"}
Apr 16 22:14:16.758530 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:16.758508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kzvxc" event={"ID":"c77a6141-b229-4002-8fb2-722d0a6093bb","Type":"ContainerStarted","Data":"631299813f7b865b564c38c2553cdba92bcda5e3bcc352cd3e35cc47769e851e"}
Apr 16 22:14:17.765090 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:17.765053 2576 generic.go:358] "Generic (PLEG): container finished" podID="a0f43c4e-755a-43dd-96e1-ee4825dcce6e"
containerID="dde4669d4069668982a0a012f08568a6dc7b62c24758b592f88ed443a0571064" exitCode=0 Apr 16 22:14:17.765718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:17.765103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerDied","Data":"dde4669d4069668982a0a012f08568a6dc7b62c24758b592f88ed443a0571064"} Apr 16 22:14:18.110606 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:18.110566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:14:18.110780 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:18.110641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r" Apr 16 22:14:18.110780 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:18.110726 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:18.110887 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:18.110786 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:18.110887 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:18.110801 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert podName:3fede498-aa63-44a3-8ef2-e602f7ca7131 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:22.110781914 +0000 UTC m=+41.235406194 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert") pod "ingress-canary-898dq" (UID: "3fede498-aa63-44a3-8ef2-e602f7ca7131") : secret "canary-serving-cert" not found Apr 16 22:14:18.110887 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:18.110839 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls podName:c2ffb347-b866-40c3-b690-eb1a564849ef nodeName:}" failed. No retries permitted until 2026-04-16 22:14:22.110822511 +0000 UTC m=+41.235446789 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls") pod "dns-default-sbt4r" (UID: "c2ffb347-b866-40c3-b690-eb1a564849ef") : secret "dns-default-metrics-tls" not found Apr 16 22:14:18.770796 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:18.770739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" event={"ID":"a0f43c4e-755a-43dd-96e1-ee4825dcce6e","Type":"ContainerStarted","Data":"868f62c89a876cf631b68637c961ed71b304210a7199a84e7db4582a0823582a"} Apr 16 22:14:18.796438 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:18.796374 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p6fh4" podStartSLOduration=4.859508476 podStartE2EDuration="37.796356579s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:13:42.695081135 +0000 UTC m=+1.819705400" lastFinishedPulling="2026-04-16 22:14:15.631929229 +0000 UTC m=+34.756553503" observedRunningTime="2026-04-16 22:14:18.795847243 +0000 UTC m=+37.920471530" watchObservedRunningTime="2026-04-16 22:14:18.796356579 +0000 UTC m=+37.920980876" Apr 16 22:14:21.777725 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:21.777688 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kzvxc" event={"ID":"c77a6141-b229-4002-8fb2-722d0a6093bb","Type":"ContainerStarted","Data":"1a5398682099e56ef1cbab86fc58dfee50691cb0f44f91be8822ae6083d09395"} Apr 16 22:14:21.802843 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:21.802799 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kzvxc" podStartSLOduration=33.914875754 podStartE2EDuration="37.802784678s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:14:16.73497746 +0000 UTC m=+35.859601726" lastFinishedPulling="2026-04-16 22:14:20.622886385 +0000 UTC m=+39.747510650" observedRunningTime="2026-04-16 22:14:21.8023911 +0000 UTC m=+40.927015387" watchObservedRunningTime="2026-04-16 22:14:21.802784678 +0000 UTC m=+40.927408965" Apr 16 22:14:22.139826 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:22.139732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:14:22.139826 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:22.139788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r" Apr 16 22:14:22.140000 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:22.139873 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:22.140000 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:22.139876 2576 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:22.140000 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:22.139923 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls podName:c2ffb347-b866-40c3-b690-eb1a564849ef nodeName:}" failed. No retries permitted until 2026-04-16 22:14:30.139912441 +0000 UTC m=+49.264536718 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls") pod "dns-default-sbt4r" (UID: "c2ffb347-b866-40c3-b690-eb1a564849ef") : secret "dns-default-metrics-tls" not found Apr 16 22:14:22.140000 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:22.139936 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert podName:3fede498-aa63-44a3-8ef2-e602f7ca7131 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:30.139930665 +0000 UTC m=+49.264554930 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert") pod "ingress-canary-898dq" (UID: "3fede498-aa63-44a3-8ef2-e602f7ca7131") : secret "canary-serving-cert" not found Apr 16 22:14:30.193607 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:30.193565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:14:30.193607 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:30.193611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r" Apr 16 22:14:30.194025 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:30.193707 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:30.194025 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:30.193765 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert podName:3fede498-aa63-44a3-8ef2-e602f7ca7131 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:46.193752224 +0000 UTC m=+65.318376494 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert") pod "ingress-canary-898dq" (UID: "3fede498-aa63-44a3-8ef2-e602f7ca7131") : secret "canary-serving-cert" not found Apr 16 22:14:30.194025 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:30.193713 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:30.194025 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:30.193844 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls podName:c2ffb347-b866-40c3-b690-eb1a564849ef nodeName:}" failed. No retries permitted until 2026-04-16 22:14:46.193825081 +0000 UTC m=+65.318449364 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls") pod "dns-default-sbt4r" (UID: "c2ffb347-b866-40c3-b690-eb1a564849ef") : secret "dns-default-metrics-tls" not found Apr 16 22:14:32.664632 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.664597 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2"] Apr 16 22:14:32.667477 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.667460 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.669875 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.669851 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 22:14:32.670016 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.669895 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 22:14:32.670873 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.670851 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 22:14:32.670965 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.670858 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 22:14:32.675669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.675649 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2"] Apr 16 22:14:32.709863 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.709833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38bedc04-9717-48cb-bbe2-f7b213757480-tmp\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" (UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.710009 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.709902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwfzr\" (UniqueName: 
\"kubernetes.io/projected/38bedc04-9717-48cb-bbe2-f7b213757480-kube-api-access-mwfzr\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" (UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.710009 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.709935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/38bedc04-9717-48cb-bbe2-f7b213757480-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" (UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.811070 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.811033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38bedc04-9717-48cb-bbe2-f7b213757480-tmp\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" (UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.811249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.811105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwfzr\" (UniqueName: \"kubernetes.io/projected/38bedc04-9717-48cb-bbe2-f7b213757480-kube-api-access-mwfzr\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" (UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.811249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.811140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/38bedc04-9717-48cb-bbe2-f7b213757480-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" 
(UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.811838 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.811812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38bedc04-9717-48cb-bbe2-f7b213757480-tmp\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" (UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.816369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.816346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/38bedc04-9717-48cb-bbe2-f7b213757480-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" (UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.820095 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.820061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwfzr\" (UniqueName: \"kubernetes.io/projected/38bedc04-9717-48cb-bbe2-f7b213757480-kube-api-access-mwfzr\") pod \"klusterlet-addon-workmgr-5d8d748b5b-m2jl2\" (UID: \"38bedc04-9717-48cb-bbe2-f7b213757480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:32.977474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:32.977438 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:33.089624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:33.089562 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2"] Apr 16 22:14:33.093246 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:14:33.093202 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38bedc04_9717_48cb_bbe2_f7b213757480.slice/crio-4867e85829a6217eb13c1166535417ddd2d6857069aa4a69de626d42861694e2 WatchSource:0}: Error finding container 4867e85829a6217eb13c1166535417ddd2d6857069aa4a69de626d42861694e2: Status 404 returned error can't find the container with id 4867e85829a6217eb13c1166535417ddd2d6857069aa4a69de626d42861694e2 Apr 16 22:14:33.800451 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:33.800403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" event={"ID":"38bedc04-9717-48cb-bbe2-f7b213757480","Type":"ContainerStarted","Data":"4867e85829a6217eb13c1166535417ddd2d6857069aa4a69de626d42861694e2"} Apr 16 22:14:37.811030 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:37.810993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" event={"ID":"38bedc04-9717-48cb-bbe2-f7b213757480","Type":"ContainerStarted","Data":"60998fd019ba650a6a8c3b29b928a34f571d1366a15261e5061078aa77842e11"} Apr 16 22:14:37.811460 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:37.811196 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:37.812951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:37.812928 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" Apr 16 22:14:37.826071 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:37.826020 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" podStartSLOduration=1.896560574 podStartE2EDuration="5.826006203s" podCreationTimestamp="2026-04-16 22:14:32 +0000 UTC" firstStartedPulling="2026-04-16 22:14:33.094840839 +0000 UTC m=+52.219465104" lastFinishedPulling="2026-04-16 22:14:37.024286457 +0000 UTC m=+56.148910733" observedRunningTime="2026-04-16 22:14:37.825840155 +0000 UTC m=+56.950464442" watchObservedRunningTime="2026-04-16 22:14:37.826006203 +0000 UTC m=+56.950630491" Apr 16 22:14:39.753232 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:39.753200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l8m6g" Apr 16 22:14:46.111405 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.111367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:14:46.114193 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.114173 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:46.121617 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:46.121600 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:46.121670 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:46.121660 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:50.121641887 +0000 UTC m=+129.246266152 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : secret "metrics-daemon-secret" not found Apr 16 22:14:46.212085 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.212049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r" Apr 16 22:14:46.212204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.212099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:14:46.212204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.212135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:14:46.212299 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:46.212217 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:46.212299 ip-10-0-133-183 
kubenswrapper[2576]: E0416 22:14:46.212221 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:46.212299 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:46.212278 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert podName:3fede498-aa63-44a3-8ef2-e602f7ca7131 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:18.212266241 +0000 UTC m=+97.336890506 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert") pod "ingress-canary-898dq" (UID: "3fede498-aa63-44a3-8ef2-e602f7ca7131") : secret "canary-serving-cert" not found Apr 16 22:14:46.212299 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:14:46.212294 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls podName:c2ffb347-b866-40c3-b690-eb1a564849ef nodeName:}" failed. No retries permitted until 2026-04-16 22:15:18.212286195 +0000 UTC m=+97.336910463 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls") pod "dns-default-sbt4r" (UID: "c2ffb347-b866-40c3-b690-eb1a564849ef") : secret "dns-default-metrics-tls" not found Apr 16 22:14:46.215016 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.214997 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:46.224238 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.224220 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:46.235230 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.235213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5c7k\" (UniqueName: \"kubernetes.io/projected/c5bcd735-429b-49bf-8436-33eb976d199a-kube-api-access-f5c7k\") pod \"network-check-target-kd4tx\" (UID: \"c5bcd735-429b-49bf-8436-33eb976d199a\") " pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:14:46.269779 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.269755 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lt98\"" Apr 16 22:14:46.277722 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.277703 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:14:46.385243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.385176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kd4tx"] Apr 16 22:14:46.388750 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:14:46.388720 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5bcd735_429b_49bf_8436_33eb976d199a.slice/crio-162466128eea93610dcd2ed88b0bc14f82d7192c9661a609bd91f14164e568d4 WatchSource:0}: Error finding container 162466128eea93610dcd2ed88b0bc14f82d7192c9661a609bd91f14164e568d4: Status 404 returned error can't find the container with id 162466128eea93610dcd2ed88b0bc14f82d7192c9661a609bd91f14164e568d4 Apr 16 22:14:46.829525 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:46.829481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kd4tx" event={"ID":"c5bcd735-429b-49bf-8436-33eb976d199a","Type":"ContainerStarted","Data":"162466128eea93610dcd2ed88b0bc14f82d7192c9661a609bd91f14164e568d4"} Apr 16 22:14:49.840789 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:49.840747 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kd4tx" event={"ID":"c5bcd735-429b-49bf-8436-33eb976d199a","Type":"ContainerStarted","Data":"18aee14e31de5e1308a8071410f33ce7e9917507be67e35cfd3edbe87311a256"} Apr 16 22:14:49.841165 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:49.840885 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:14:49.856046 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:14:49.855991 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kd4tx" 
podStartSLOduration=66.244938059 podStartE2EDuration="1m8.855974869s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:14:46.390514417 +0000 UTC m=+65.515138682" lastFinishedPulling="2026-04-16 22:14:49.001551227 +0000 UTC m=+68.126175492" observedRunningTime="2026-04-16 22:14:49.854994987 +0000 UTC m=+68.979619274" watchObservedRunningTime="2026-04-16 22:14:49.855974869 +0000 UTC m=+68.980599153" Apr 16 22:15:18.227295 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:18.227266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:15:18.227749 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:18.227309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r" Apr 16 22:15:18.227749 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:18.227417 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:18.227749 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:18.227435 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:18.227749 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:18.227477 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls podName:c2ffb347-b866-40c3-b690-eb1a564849ef nodeName:}" failed. 
No retries permitted until 2026-04-16 22:16:22.227462214 +0000 UTC m=+161.352086479 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls") pod "dns-default-sbt4r" (UID: "c2ffb347-b866-40c3-b690-eb1a564849ef") : secret "dns-default-metrics-tls" not found Apr 16 22:15:18.227749 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:18.227511 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert podName:3fede498-aa63-44a3-8ef2-e602f7ca7131 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:22.227493025 +0000 UTC m=+161.352117309 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert") pod "ingress-canary-898dq" (UID: "3fede498-aa63-44a3-8ef2-e602f7ca7131") : secret "canary-serving-cert" not found Apr 16 22:15:20.844936 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:20.844906 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kd4tx" Apr 16 22:15:50.148677 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:50.148620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:15:50.149182 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:50.148775 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:15:50.149182 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:50.148837 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs podName:c3eaddc5-e6c1-45aa-a952-0c7d74359e05 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:52.148822545 +0000 UTC m=+251.273446811 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs") pod "network-metrics-daemon-6dklm" (UID: "c3eaddc5-e6c1-45aa-a952-0c7d74359e05") : secret "metrics-daemon-secret" not found Apr 16 22:15:55.532570 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.532486 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5ccb67d98b-l72zk"] Apr 16 22:15:55.535810 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.535793 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz"] Apr 16 22:15:55.535955 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.535938 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.538709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.538690 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.541598 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.541565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 22:15:55.541598 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.541568 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 22:15:55.541772 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.541569 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 22:15:55.541772 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.541705 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sc7h7\"" Apr 16 22:15:55.541772 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.541744 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:15:55.541916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.541801 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 22:15:55.541916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.541861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:55.541916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.541864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:15:55.542182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.542161 2576 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 22:15:55.542277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.542161 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:55.542401 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.542387 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 22:15:55.542474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.542391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-q9fk9\"" Apr 16 22:15:55.546877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.546856 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz"] Apr 16 22:15:55.555970 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.555945 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5ccb67d98b-l72zk"] Apr 16 22:15:55.684749 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.684709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgqxm\" (UniqueName: \"kubernetes.io/projected/c83c75a9-9ffe-4644-b733-f725231b1b4b-kube-api-access-mgqxm\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.684914 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.684761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c83c75a9-9ffe-4644-b733-f725231b1b4b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.684914 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.684841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-default-certificate\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.684914 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.684891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qsw4\" (UniqueName: \"kubernetes.io/projected/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-kube-api-access-2qsw4\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.685042 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.684930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.685042 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.684972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-stats-auth\") pod \"router-default-5ccb67d98b-l72zk\" (UID: 
\"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.685042 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.685000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83c75a9-9ffe-4644-b733-f725231b1b4b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.685042 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.685014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.727204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.727167 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc"] Apr 16 22:15:55.730126 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.730106 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc" Apr 16 22:15:55.732436 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.732416 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-9jz7r\"" Apr 16 22:15:55.736878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.736855 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc"] Apr 16 22:15:55.786174 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.786174 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-stats-auth\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.786369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83c75a9-9ffe-4644-b733-f725231b1b4b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.786369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786199 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.786369 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:55.786248 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:55.786369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgqxm\" (UniqueName: \"kubernetes.io/projected/c83c75a9-9ffe-4644-b733-f725231b1b4b-kube-api-access-mgqxm\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.786369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83c75a9-9ffe-4644-b733-f725231b1b4b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.786369 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:55.786309 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.286291223 +0000 UTC m=+135.410915499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : secret "router-metrics-certs-default" not found Apr 16 22:15:55.786369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-default-certificate\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.786721 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:55.786395 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.286376274 +0000 UTC m=+135.411000555 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:55.786721 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qsw4\" (UniqueName: \"kubernetes.io/projected/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-kube-api-access-2qsw4\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.786863 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.786841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83c75a9-9ffe-4644-b733-f725231b1b4b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.788603 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.788583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83c75a9-9ffe-4644-b733-f725231b1b4b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.788742 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.788725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-default-certificate\") pod 
\"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.788801 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.788779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-stats-auth\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.794988 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.794962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgqxm\" (UniqueName: \"kubernetes.io/projected/c83c75a9-9ffe-4644-b733-f725231b1b4b-kube-api-access-mgqxm\") pod \"kube-storage-version-migrator-operator-6769c5d45-5ddtz\" (UID: \"c83c75a9-9ffe-4644-b733-f725231b1b4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.795156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.795141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qsw4\" (UniqueName: \"kubernetes.io/projected/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-kube-api-access-2qsw4\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:55.851570 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.851543 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" Apr 16 22:15:55.887040 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.887007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fg7z\" (UniqueName: \"kubernetes.io/projected/af33378f-5a90-42ce-933e-c366ea1cbfb6-kube-api-access-9fg7z\") pod \"network-check-source-8894fc9bd-7qnvc\" (UID: \"af33378f-5a90-42ce-933e-c366ea1cbfb6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc" Apr 16 22:15:55.961744 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.961717 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz"] Apr 16 22:15:55.964788 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:15:55.964764 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83c75a9_9ffe_4644_b733_f725231b1b4b.slice/crio-f363b70d54425343d3bd5e94ce8dfb3c0d35ed8c94eb07017210d389b05d32ae WatchSource:0}: Error finding container f363b70d54425343d3bd5e94ce8dfb3c0d35ed8c94eb07017210d389b05d32ae: Status 404 returned error can't find the container with id f363b70d54425343d3bd5e94ce8dfb3c0d35ed8c94eb07017210d389b05d32ae Apr 16 22:15:55.987524 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.987500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fg7z\" (UniqueName: \"kubernetes.io/projected/af33378f-5a90-42ce-933e-c366ea1cbfb6-kube-api-access-9fg7z\") pod \"network-check-source-8894fc9bd-7qnvc\" (UID: \"af33378f-5a90-42ce-933e-c366ea1cbfb6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc" Apr 16 22:15:55.995915 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:55.995891 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9fg7z\" (UniqueName: \"kubernetes.io/projected/af33378f-5a90-42ce-933e-c366ea1cbfb6-kube-api-access-9fg7z\") pod \"network-check-source-8894fc9bd-7qnvc\" (UID: \"af33378f-5a90-42ce-933e-c366ea1cbfb6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc" Apr 16 22:15:56.039510 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:56.039441 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc" Apr 16 22:15:56.153002 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:56.152972 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc"] Apr 16 22:15:56.156069 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:15:56.156044 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf33378f_5a90_42ce_933e_c366ea1cbfb6.slice/crio-c8d3235b61a00de50529295521701e7bd32d1aa3b46bea6886d62459d4df73be WatchSource:0}: Error finding container c8d3235b61a00de50529295521701e7bd32d1aa3b46bea6886d62459d4df73be: Status 404 returned error can't find the container with id c8d3235b61a00de50529295521701e7bd32d1aa3b46bea6886d62459d4df73be Apr 16 22:15:56.290250 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:56.290159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:56.290250 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:56.290208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:56.290440 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:56.290318 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:56.290440 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:56.290381 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:57.290361511 +0000 UTC m=+136.414985791 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:56.290440 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:56.290397 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:57.290390236 +0000 UTC m=+136.415014501 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : secret "router-metrics-certs-default" not found Apr 16 22:15:56.965452 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:56.965416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" event={"ID":"c83c75a9-9ffe-4644-b733-f725231b1b4b","Type":"ContainerStarted","Data":"f363b70d54425343d3bd5e94ce8dfb3c0d35ed8c94eb07017210d389b05d32ae"} Apr 16 22:15:56.967302 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:56.966704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc" event={"ID":"af33378f-5a90-42ce-933e-c366ea1cbfb6","Type":"ContainerStarted","Data":"58d0b26e6d522edacc180160533d587b1e0d81846f745e20e8aec72af9a6ff6a"} Apr 16 22:15:56.967302 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:56.966732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc" event={"ID":"af33378f-5a90-42ce-933e-c366ea1cbfb6","Type":"ContainerStarted","Data":"c8d3235b61a00de50529295521701e7bd32d1aa3b46bea6886d62459d4df73be"} Apr 16 22:15:56.981984 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:56.981915 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7qnvc" podStartSLOduration=1.981898615 podStartE2EDuration="1.981898615s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:15:56.981625688 +0000 UTC m=+136.106249976" watchObservedRunningTime="2026-04-16 22:15:56.981898615 +0000 
UTC m=+136.106522901" Apr 16 22:15:57.300005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:57.299913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:57.300005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:57.299992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:15:57.300228 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:57.300032 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:57.300228 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:57.300113 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:59.300092713 +0000 UTC m=+138.424716993 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : secret "router-metrics-certs-default" not found
Apr 16 22:15:57.300228 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:57.300144 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:59.300129157 +0000 UTC m=+138.424753428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : configmap references non-existent config key: service-ca.crt
Apr 16 22:15:57.969745 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:57.969704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" event={"ID":"c83c75a9-9ffe-4644-b733-f725231b1b4b","Type":"ContainerStarted","Data":"724b6f174af8fa0bb530703bf44035784345e50b5ff56f696548b7f10a467a57"}
Apr 16 22:15:57.985917 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:57.985869 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" podStartSLOduration=1.488481745 podStartE2EDuration="2.985854723s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.966445953 +0000 UTC m=+135.091070219" lastFinishedPulling="2026-04-16 22:15:57.463818925 +0000 UTC m=+136.588443197" observedRunningTime="2026-04-16 22:15:57.984355492 +0000 UTC m=+137.108979789" watchObservedRunningTime="2026-04-16 22:15:57.985854723 +0000 UTC m=+137.110479010"
Apr 16 22:15:58.955469 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:58.955438 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw"]
Apr 16 22:15:58.960171 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:58.960140 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw"
Apr 16 22:15:58.963238 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:58.963215 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 22:15:58.963806 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:58.963786 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kwljg\""
Apr 16 22:15:58.963917 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:58.963786 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 22:15:58.967928 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:58.967902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw"]
Apr 16 22:15:59.112760 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:59.112723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmg9r\" (UniqueName: \"kubernetes.io/projected/2bb3aae4-cca6-473a-963f-eaafa8efc8fa-kube-api-access-bmg9r\") pod \"migrator-74bb7799d9-bx8dw\" (UID: \"2bb3aae4-cca6-473a-963f-eaafa8efc8fa\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw"
Apr 16 22:15:59.213354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:59.213244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmg9r\" (UniqueName: \"kubernetes.io/projected/2bb3aae4-cca6-473a-963f-eaafa8efc8fa-kube-api-access-bmg9r\") pod \"migrator-74bb7799d9-bx8dw\" (UID: \"2bb3aae4-cca6-473a-963f-eaafa8efc8fa\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw"
Apr 16 22:15:59.221529 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:59.221498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmg9r\" (UniqueName: \"kubernetes.io/projected/2bb3aae4-cca6-473a-963f-eaafa8efc8fa-kube-api-access-bmg9r\") pod \"migrator-74bb7799d9-bx8dw\" (UID: \"2bb3aae4-cca6-473a-963f-eaafa8efc8fa\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw"
Apr 16 22:15:59.269613 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:59.269588 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw"
Apr 16 22:15:59.317063 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:59.316799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:15:59.317063 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:59.316937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:15:59.317253 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:59.317077 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 22:15:59.317253 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:59.317143 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.317123224 +0000 UTC m=+142.441747492 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : secret "router-metrics-certs-default" not found
Apr 16 22:15:59.317253 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:15:59.317182 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.317162757 +0000 UTC m=+142.441787031 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : configmap references non-existent config key: service-ca.crt
Apr 16 22:15:59.385428 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:59.385399 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw"]
Apr 16 22:15:59.388652 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:15:59.388621 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb3aae4_cca6_473a_963f_eaafa8efc8fa.slice/crio-21f2cd5e945156af659f005664fabdf065a91e9e5ecc74aed97e085b2fd0f5d1 WatchSource:0}: Error finding container 21f2cd5e945156af659f005664fabdf065a91e9e5ecc74aed97e085b2fd0f5d1: Status 404 returned error can't find the container with id 21f2cd5e945156af659f005664fabdf065a91e9e5ecc74aed97e085b2fd0f5d1
Apr 16 22:15:59.974533 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:15:59.974491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw" event={"ID":"2bb3aae4-cca6-473a-963f-eaafa8efc8fa","Type":"ContainerStarted","Data":"21f2cd5e945156af659f005664fabdf065a91e9e5ecc74aed97e085b2fd0f5d1"}
Apr 16 22:16:00.978295 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:00.978251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw" event={"ID":"2bb3aae4-cca6-473a-963f-eaafa8efc8fa","Type":"ContainerStarted","Data":"0e6ecf4b08b45a6a188fc4a0722df91c9b06f157e185203e1da78d34d747562b"}
Apr 16 22:16:00.978295 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:00.978293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw" event={"ID":"2bb3aae4-cca6-473a-963f-eaafa8efc8fa","Type":"ContainerStarted","Data":"e1d0a0ea37f322331b0a2010e05033c2e9e1362e96b8ec260bf237b9c5d1efe2"}
Apr 16 22:16:00.993360 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:00.993293 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bx8dw" podStartSLOduration=1.953523853 podStartE2EDuration="2.993278285s" podCreationTimestamp="2026-04-16 22:15:58 +0000 UTC" firstStartedPulling="2026-04-16 22:15:59.390517948 +0000 UTC m=+138.515142227" lastFinishedPulling="2026-04-16 22:16:00.430272391 +0000 UTC m=+139.554896659" observedRunningTime="2026-04-16 22:16:00.993171197 +0000 UTC m=+140.117795484" watchObservedRunningTime="2026-04-16 22:16:00.993278285 +0000 UTC m=+140.117902574"
Apr 16 22:16:02.008479 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.008434 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66f768b8d9-tgggs"]
Apr 16 22:16:02.011424 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.011407 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.018206 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.018181 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 22:16:02.018354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.018210 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 22:16:02.018354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.018236 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2kcsr\""
Apr 16 22:16:02.018354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.018240 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 22:16:02.022854 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.022834 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 22:16:02.034378 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.034350 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66f768b8d9-tgggs"]
Apr 16 22:16:02.138075 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.138041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-image-registry-private-configuration\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.138252 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.138080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.138252 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.138142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-bound-sa-token\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.138252 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.138185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-trusted-ca\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.138252 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.138234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-installation-pull-secrets\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.138463 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.138268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-certificates\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.138463 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.138298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e51ed12b-ab5f-4580-b228-238d2b5b2534-ca-trust-extracted\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.138463 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.138424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnwkd\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-kube-api-access-fnwkd\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.239584 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.239551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-bound-sa-token\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.239584 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.239585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-trusted-ca\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.239822 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.239613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-installation-pull-secrets\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.239822 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.239779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-certificates\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.239918 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.239821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e51ed12b-ab5f-4580-b228-238d2b5b2534-ca-trust-extracted\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.239918 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.239906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnwkd\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-kube-api-access-fnwkd\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.240004 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.239966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-image-registry-private-configuration\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.240004 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.239995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.240144 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:02.240127 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:16:02.240205 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:02.240148 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f768b8d9-tgggs: secret "image-registry-tls" not found
Apr 16 22:16:02.240258 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:02.240213 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls podName:e51ed12b-ab5f-4580-b228-238d2b5b2534 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:02.740197106 +0000 UTC m=+141.864821391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls") pod "image-registry-66f768b8d9-tgggs" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534") : secret "image-registry-tls" not found
Apr 16 22:16:02.240258 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.240231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e51ed12b-ab5f-4580-b228-238d2b5b2534-ca-trust-extracted\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.240697 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.240678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-trusted-ca\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.240992 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.240971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-certificates\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.242156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.242135 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-installation-pull-secrets\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.242262 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.242243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-image-registry-private-configuration\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.251188 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.251167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-bound-sa-token\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.251468 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.251450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnwkd\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-kube-api-access-fnwkd\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.742670 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:02.742623 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:02.742831 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:02.742779 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:16:02.742831 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:02.742797 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f768b8d9-tgggs: secret "image-registry-tls" not found
Apr 16 22:16:02.742981 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:02.742864 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls podName:e51ed12b-ab5f-4580-b228-238d2b5b2534 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.742846333 +0000 UTC m=+142.867470602 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls") pod "image-registry-66f768b8d9-tgggs" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534") : secret "image-registry-tls" not found
Apr 16 22:16:03.348640 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:03.348608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:16:03.348989 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:03.348661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:16:03.348989 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:03.348755 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:11.348739286 +0000 UTC m=+150.473363554 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : configmap references non-existent config key: service-ca.crt
Apr 16 22:16:03.348989 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:03.348793 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 22:16:03.348989 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:03.348852 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs podName:91f82d6f-53a5-432d-b3a4-03bdde2f00e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:11.348841562 +0000 UTC m=+150.473465826 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs") pod "router-default-5ccb67d98b-l72zk" (UID: "91f82d6f-53a5-432d-b3a4-03bdde2f00e5") : secret "router-metrics-certs-default" not found
Apr 16 22:16:03.372022 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:03.372001 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58875_7a33450b-5146-4096-a0eb-79767266b790/dns-node-resolver/0.log"
Apr 16 22:16:03.751334 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:03.751298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:03.751496 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:03.751444 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:16:03.751496 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:03.751466 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f768b8d9-tgggs: secret "image-registry-tls" not found
Apr 16 22:16:03.751571 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:03.751518 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls podName:e51ed12b-ab5f-4580-b228-238d2b5b2534 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:05.751504625 +0000 UTC m=+144.876128891 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls") pod "image-registry-66f768b8d9-tgggs" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534") : secret "image-registry-tls" not found
Apr 16 22:16:04.596464 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:04.596442 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t27sn_534a804f-de9c-430e-9e0a-47849b4977da/node-ca/0.log"
Apr 16 22:16:05.578557 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:05.578519 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bx8dw_2bb3aae4-cca6-473a-963f-eaafa8efc8fa/migrator/0.log"
Apr 16 22:16:05.767637 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:05.767597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:05.767992 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:05.767723 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:16:05.767992 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:05.767735 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f768b8d9-tgggs: secret "image-registry-tls" not found
Apr 16 22:16:05.767992 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:05.767795 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls podName:e51ed12b-ab5f-4580-b228-238d2b5b2534 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:09.767782193 +0000 UTC m=+148.892406458 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls") pod "image-registry-66f768b8d9-tgggs" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534") : secret "image-registry-tls" not found
Apr 16 22:16:05.775784 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:05.775762 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bx8dw_2bb3aae4-cca6-473a-963f-eaafa8efc8fa/graceful-termination/0.log"
Apr 16 22:16:05.978581 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:05.978554 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-5ddtz_c83c75a9-9ffe-4644-b733-f725231b1b4b/kube-storage-version-migrator-operator/0.log"
Apr 16 22:16:09.795985 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:09.795953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:09.798046 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:09.798025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") pod \"image-registry-66f768b8d9-tgggs\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:09.820256 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:09.820230 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:09.943737 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:09.943703 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66f768b8d9-tgggs"]
Apr 16 22:16:09.946781 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:16:09.946753 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51ed12b_ab5f_4580_b228_238d2b5b2534.slice/crio-cee9c794d1a8a79c0cfebf4653c5286db17295d28ed4a248e8dddaa7487e728b WatchSource:0}: Error finding container cee9c794d1a8a79c0cfebf4653c5286db17295d28ed4a248e8dddaa7487e728b: Status 404 returned error can't find the container with id cee9c794d1a8a79c0cfebf4653c5286db17295d28ed4a248e8dddaa7487e728b
Apr 16 22:16:10.001061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:10.001033 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" event={"ID":"e51ed12b-ab5f-4580-b228-238d2b5b2534","Type":"ContainerStarted","Data":"cee9c794d1a8a79c0cfebf4653c5286db17295d28ed4a248e8dddaa7487e728b"}
Apr 16 22:16:11.005208 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.005168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" event={"ID":"e51ed12b-ab5f-4580-b228-238d2b5b2534","Type":"ContainerStarted","Data":"01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3"}
Apr 16 22:16:11.005624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.005310 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs"
Apr 16 22:16:11.024623 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.024579 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" podStartSLOduration=10.024564638 podStartE2EDuration="10.024564638s" podCreationTimestamp="2026-04-16 22:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:11.023636597 +0000 UTC m=+150.148260884" watchObservedRunningTime="2026-04-16 22:16:11.024564638 +0000 UTC m=+150.149188926"
Apr 16 22:16:11.407654 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.407556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:16:11.407654 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.407646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:16:11.408344 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.408305 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-service-ca-bundle\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:16:11.410485 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.410452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f82d6f-53a5-432d-b3a4-03bdde2f00e5-metrics-certs\") pod \"router-default-5ccb67d98b-l72zk\" (UID: \"91f82d6f-53a5-432d-b3a4-03bdde2f00e5\") " pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:16:11.446051 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.446024 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5ccb67d98b-l72zk"
Apr 16 22:16:11.564933 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:11.564902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5ccb67d98b-l72zk"]
Apr 16 22:16:11.568343 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:16:11.568299 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f82d6f_53a5_432d_b3a4_03bdde2f00e5.slice/crio-919373383e62db1f98513bf66fc913b2fa7b74b9520e3c76aa21eb3612eb4f7a WatchSource:0}: Error finding container 919373383e62db1f98513bf66fc913b2fa7b74b9520e3c76aa21eb3612eb4f7a: Status 404 returned error can't find the container with id 919373383e62db1f98513bf66fc913b2fa7b74b9520e3c76aa21eb3612eb4f7a
Apr 16 22:16:12.009508 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:12.009472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ccb67d98b-l72zk" event={"ID":"91f82d6f-53a5-432d-b3a4-03bdde2f00e5","Type":"ContainerStarted","Data":"084ab8ce09d1db6c48c0316c1806ca1043293bdb994a7a0b871aabae10bd7e60"}
Apr 16 22:16:12.009508 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:12.009512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ccb67d98b-l72zk" event={"ID":"91f82d6f-53a5-432d-b3a4-03bdde2f00e5","Type":"ContainerStarted","Data":"919373383e62db1f98513bf66fc913b2fa7b74b9520e3c76aa21eb3612eb4f7a"}
Apr 16 22:16:12.028836 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:12.028784 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5ccb67d98b-l72zk" podStartSLOduration=17.028771585
podStartE2EDuration="17.028771585s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:12.028169016 +0000 UTC m=+151.152793297" watchObservedRunningTime="2026-04-16 22:16:12.028771585 +0000 UTC m=+151.153395907" Apr 16 22:16:12.447229 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:12.447201 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:16:12.449590 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:12.449568 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:16:13.012363 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:13.012315 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:16:13.013520 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:13.013497 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5ccb67d98b-l72zk" Apr 16 22:16:17.328710 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:17.328653 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-898dq" podUID="3fede498-aa63-44a3-8ef2-e602f7ca7131" Apr 16 22:16:17.348938 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:17.348911 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sbt4r" podUID="c2ffb347-b866-40c3-b690-eb1a564849ef" Apr 16 22:16:17.471608 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:17.471575 2576 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6dklm" podUID="c3eaddc5-e6c1-45aa-a952-0c7d74359e05" Apr 16 22:16:18.024017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:18.023989 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:16:18.024159 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:18.024057 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sbt4r" Apr 16 22:16:22.285962 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.285922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:16:22.286470 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.285995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r" Apr 16 22:16:22.288555 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.288531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fede498-aa63-44a3-8ef2-e602f7ca7131-cert\") pod \"ingress-canary-898dq\" (UID: \"3fede498-aa63-44a3-8ef2-e602f7ca7131\") " pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:16:22.288653 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.288574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c2ffb347-b866-40c3-b690-eb1a564849ef-metrics-tls\") pod \"dns-default-sbt4r\" (UID: \"c2ffb347-b866-40c3-b690-eb1a564849ef\") " pod="openshift-dns/dns-default-sbt4r" Apr 16 22:16:22.528304 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.528273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bd46\"" Apr 16 22:16:22.528304 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.528275 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6fpb6\"" Apr 16 22:16:22.535353 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.535321 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-898dq" Apr 16 22:16:22.535400 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.535356 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sbt4r" Apr 16 22:16:22.677419 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.677397 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sbt4r"] Apr 16 22:16:22.679197 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:16:22.679158 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ffb347_b866_40c3_b690_eb1a564849ef.slice/crio-d70d0560642562d98e74b62dda243afc08a5d653e30a330d65151f509223c26a WatchSource:0}: Error finding container d70d0560642562d98e74b62dda243afc08a5d653e30a330d65151f509223c26a: Status 404 returned error can't find the container with id d70d0560642562d98e74b62dda243afc08a5d653e30a330d65151f509223c26a Apr 16 22:16:22.689372 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:22.689351 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-898dq"] Apr 16 22:16:22.692692 ip-10-0-133-183 kubenswrapper[2576]: 
W0416 22:16:22.692667 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fede498_aa63_44a3_8ef2_e602f7ca7131.slice/crio-805e174282887dd1e9edb59564074ab5e4fd1ae07927cd07b32f3cfbd716f255 WatchSource:0}: Error finding container 805e174282887dd1e9edb59564074ab5e4fd1ae07927cd07b32f3cfbd716f255: Status 404 returned error can't find the container with id 805e174282887dd1e9edb59564074ab5e4fd1ae07927cd07b32f3cfbd716f255 Apr 16 22:16:23.038000 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:23.037953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-898dq" event={"ID":"3fede498-aa63-44a3-8ef2-e602f7ca7131","Type":"ContainerStarted","Data":"805e174282887dd1e9edb59564074ab5e4fd1ae07927cd07b32f3cfbd716f255"} Apr 16 22:16:23.038981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:23.038943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sbt4r" event={"ID":"c2ffb347-b866-40c3-b690-eb1a564849ef","Type":"ContainerStarted","Data":"d70d0560642562d98e74b62dda243afc08a5d653e30a330d65151f509223c26a"} Apr 16 22:16:25.045001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.044964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sbt4r" event={"ID":"c2ffb347-b866-40c3-b690-eb1a564849ef","Type":"ContainerStarted","Data":"265f8b47ae8a7446941fa8edbc126665a2d3489e849ac9e53dbe9f5370494e7a"} Apr 16 22:16:25.045437 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.045010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sbt4r" event={"ID":"c2ffb347-b866-40c3-b690-eb1a564849ef","Type":"ContainerStarted","Data":"0d28936195ee9054ade06921c03305c68e3fe213f2fd4748c01fd886419fd66a"} Apr 16 22:16:25.045437 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.045082 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-sbt4r" Apr 16 22:16:25.046187 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.046170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-898dq" event={"ID":"3fede498-aa63-44a3-8ef2-e602f7ca7131","Type":"ContainerStarted","Data":"3de042065c39ab8c4d45a6c6091980ddf42064ca5aaf2afd3e3fe6ee616d8844"} Apr 16 22:16:25.063929 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.063881 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sbt4r" podStartSLOduration=129.246203545 podStartE2EDuration="2m11.063866835s" podCreationTimestamp="2026-04-16 22:14:14 +0000 UTC" firstStartedPulling="2026-04-16 22:16:22.680922454 +0000 UTC m=+161.805546724" lastFinishedPulling="2026-04-16 22:16:24.498585746 +0000 UTC m=+163.623210014" observedRunningTime="2026-04-16 22:16:25.063249972 +0000 UTC m=+164.187874261" watchObservedRunningTime="2026-04-16 22:16:25.063866835 +0000 UTC m=+164.188491122" Apr 16 22:16:25.083026 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.082984 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-898dq" podStartSLOduration=129.281109589 podStartE2EDuration="2m11.082969954s" podCreationTimestamp="2026-04-16 22:14:14 +0000 UTC" firstStartedPulling="2026-04-16 22:16:22.694669652 +0000 UTC m=+161.819293920" lastFinishedPulling="2026-04-16 22:16:24.496530006 +0000 UTC m=+163.621154285" observedRunningTime="2026-04-16 22:16:25.082254676 +0000 UTC m=+164.206878965" watchObservedRunningTime="2026-04-16 22:16:25.082969954 +0000 UTC m=+164.207594242" Apr 16 22:16:25.331460 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.331385 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66f768b8d9-tgggs"] Apr 16 22:16:25.336499 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.336466 2576 patch_prober.go:28] interesting 
pod/image-registry-66f768b8d9-tgggs container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 22:16:25.336630 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.336513 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" podUID="e51ed12b-ab5f-4580-b228-238d2b5b2534" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:16:25.371245 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.371220 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xwqzs"] Apr 16 22:16:25.374481 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.374454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.377974 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.377954 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bpg4g\"" Apr 16 22:16:25.378450 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.378434 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:16:25.378523 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.378482 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 22:16:25.378674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.378658 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:16:25.382785 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 22:16:25.382761 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 22:16:25.393838 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.393819 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xwqzs"] Apr 16 22:16:25.413646 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.413626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tb4n\" (UniqueName: \"kubernetes.io/projected/9b4e0565-78b4-4d03-a01b-ce6b39a81446-kube-api-access-9tb4n\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.413746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.413666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b4e0565-78b4-4d03-a01b-ce6b39a81446-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.413746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.413698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b4e0565-78b4-4d03-a01b-ce6b39a81446-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.413817 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.413758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/9b4e0565-78b4-4d03-a01b-ce6b39a81446-crio-socket\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.413817 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.413784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b4e0565-78b4-4d03-a01b-ce6b39a81446-data-volume\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.514746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.514709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b4e0565-78b4-4d03-a01b-ce6b39a81446-crio-socket\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.514746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.514738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b4e0565-78b4-4d03-a01b-ce6b39a81446-data-volume\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.514925 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.514768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tb4n\" (UniqueName: \"kubernetes.io/projected/9b4e0565-78b4-4d03-a01b-ce6b39a81446-kube-api-access-9tb4n\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.514925 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:16:25.514794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b4e0565-78b4-4d03-a01b-ce6b39a81446-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.514925 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.514824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b4e0565-78b4-4d03-a01b-ce6b39a81446-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.514925 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.514829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b4e0565-78b4-4d03-a01b-ce6b39a81446-crio-socket\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.515069 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.515051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b4e0565-78b4-4d03-a01b-ce6b39a81446-data-volume\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.515382 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.515353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b4e0565-78b4-4d03-a01b-ce6b39a81446-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xwqzs\" (UID: 
\"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.517054 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.517035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b4e0565-78b4-4d03-a01b-ce6b39a81446-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.528266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.528246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tb4n\" (UniqueName: \"kubernetes.io/projected/9b4e0565-78b4-4d03-a01b-ce6b39a81446-kube-api-access-9tb4n\") pod \"insights-runtime-extractor-xwqzs\" (UID: \"9b4e0565-78b4-4d03-a01b-ce6b39a81446\") " pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.684180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.684090 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xwqzs" Apr 16 22:16:25.854882 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:25.854856 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xwqzs"] Apr 16 22:16:25.856879 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:16:25.856849 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4e0565_78b4_4d03_a01b_ce6b39a81446.slice/crio-34b1adf4b9d0204f79bad4904f5d96100d058bc40826407b285a478dfacfe9a5 WatchSource:0}: Error finding container 34b1adf4b9d0204f79bad4904f5d96100d058bc40826407b285a478dfacfe9a5: Status 404 returned error can't find the container with id 34b1adf4b9d0204f79bad4904f5d96100d058bc40826407b285a478dfacfe9a5 Apr 16 22:16:26.049755 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:26.049724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xwqzs" event={"ID":"9b4e0565-78b4-4d03-a01b-ce6b39a81446","Type":"ContainerStarted","Data":"e800d93f498d2877e0722cfb05c8dd04062203c5dc9570a9a402e6a9cc525b7d"} Apr 16 22:16:26.049755 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:26.049759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xwqzs" event={"ID":"9b4e0565-78b4-4d03-a01b-ce6b39a81446","Type":"ContainerStarted","Data":"34b1adf4b9d0204f79bad4904f5d96100d058bc40826407b285a478dfacfe9a5"} Apr 16 22:16:27.053558 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:27.053527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xwqzs" event={"ID":"9b4e0565-78b4-4d03-a01b-ce6b39a81446","Type":"ContainerStarted","Data":"617801f9a6dd374d711b1700e876059f89fa60415f3bd48a4d3e4923b78494de"} Apr 16 22:16:28.057295 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:28.057259 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-xwqzs" event={"ID":"9b4e0565-78b4-4d03-a01b-ce6b39a81446","Type":"ContainerStarted","Data":"a78e6e1d6061ec82c2cfd378a596aeb5042524bfa2d9de4f698feb474d6a8972"} Apr 16 22:16:28.076301 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:28.076255 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xwqzs" podStartSLOduration=1.149454435 podStartE2EDuration="3.076241472s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:25.909005257 +0000 UTC m=+165.033629522" lastFinishedPulling="2026-04-16 22:16:27.835792294 +0000 UTC m=+166.960416559" observedRunningTime="2026-04-16 22:16:28.074946657 +0000 UTC m=+167.199570944" watchObservedRunningTime="2026-04-16 22:16:28.076241472 +0000 UTC m=+167.200865736" Apr 16 22:16:32.448419 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:32.448321 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:16:34.392221 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.392188 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"] Apr 16 22:16:34.395296 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.395279 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" Apr 16 22:16:34.398484 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.398462 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 22:16:34.398592 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.398553 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 22:16:34.399813 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.399795 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gtczd\"" Apr 16 22:16:34.399813 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.399807 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:16:34.399940 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.399833 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:16:34.399940 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.399837 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:16:34.410617 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.410591 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"] Apr 16 22:16:34.482638 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.482600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92dgx\" (UniqueName: \"kubernetes.io/projected/5f4172c2-c04a-49df-b80d-bc8d1949aaca-kube-api-access-92dgx\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: 
\"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" Apr 16 22:16:34.482638 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.482635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" Apr 16 22:16:34.482868 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.482720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f4172c2-c04a-49df-b80d-bc8d1949aaca-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" Apr 16 22:16:34.482868 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.482794 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" Apr 16 22:16:34.583151 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.583113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92dgx\" (UniqueName: \"kubernetes.io/projected/5f4172c2-c04a-49df-b80d-bc8d1949aaca-kube-api-access-92dgx\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" Apr 16 
22:16:34.583151 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.583151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" Apr 16 22:16:34.583439 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.583188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f4172c2-c04a-49df-b80d-bc8d1949aaca-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" Apr 16 22:16:34.583439 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:34.583278 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 22:16:34.583439 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:34.583347 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls podName:5f4172c2-c04a-49df-b80d-bc8d1949aaca nodeName:}" failed. No retries permitted until 2026-04-16 22:16:35.083309512 +0000 UTC m=+174.207933776 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-s2ssf" (UID: "5f4172c2-c04a-49df-b80d-bc8d1949aaca") : secret "prometheus-operator-tls" not found
Apr 16 22:16:34.583439 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.583386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"
Apr 16 22:16:34.583927 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.583905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f4172c2-c04a-49df-b80d-bc8d1949aaca-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"
Apr 16 22:16:34.585697 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.585675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"
Apr 16 22:16:34.592267 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:34.592239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92dgx\" (UniqueName: \"kubernetes.io/projected/5f4172c2-c04a-49df-b80d-bc8d1949aaca-kube-api-access-92dgx\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"
Apr 16 22:16:35.052439 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:35.052410 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sbt4r"
Apr 16 22:16:35.088376 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:35.088245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"
Apr 16 22:16:35.088563 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:35.088386 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 16 22:16:35.088563 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:35.088460 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls podName:5f4172c2-c04a-49df-b80d-bc8d1949aaca nodeName:}" failed. No retries permitted until 2026-04-16 22:16:36.088440352 +0000 UTC m=+175.213064617 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-s2ssf" (UID: "5f4172c2-c04a-49df-b80d-bc8d1949aaca") : secret "prometheus-operator-tls" not found
Apr 16 22:16:35.335703 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:35.335624 2576 patch_prober.go:28] interesting pod/image-registry-66f768b8d9-tgggs container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:16:35.335703 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:35.335691 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" podUID="e51ed12b-ab5f-4580-b228-238d2b5b2534" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:16:36.097642 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.097608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"
Apr 16 22:16:36.100017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.099982 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f4172c2-c04a-49df-b80d-bc8d1949aaca-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s2ssf\" (UID: \"5f4172c2-c04a-49df-b80d-bc8d1949aaca\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"
Apr 16 22:16:36.203999 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.203965 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"
Apr 16 22:16:36.211837 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.211816 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77fd5bfc6c-c6v9b"]
Apr 16 22:16:36.216125 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.216105 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.219041 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.219020 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 22:16:36.219145 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.219048 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 22:16:36.219145 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.219021 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 22:16:36.220076 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.220053 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-v9r8c\""
Apr 16 22:16:36.220164 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.220101 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 22:16:36.220164 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.220062 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 22:16:36.220543 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.220528 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 22:16:36.220684 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.220668 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 22:16:36.224697 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.224673 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 22:16:36.225776 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.225278 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77fd5bfc6c-c6v9b"]
Apr 16 22:16:36.300003 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.299960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-oauth-config\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.300003 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.299996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-oauth-serving-cert\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.300233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.300018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-console-config\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.300233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.300038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-serving-cert\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.300233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.300139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-trusted-ca-bundle\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.300233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.300183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-service-ca\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.300233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.300206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cbsr\" (UniqueName: \"kubernetes.io/projected/c79d225c-5a16-4137-b10d-66f4e314492f-kube-api-access-6cbsr\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.327087 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.327058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s2ssf"]
Apr 16 22:16:36.330178 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:16:36.330146 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f4172c2_c04a_49df_b80d_bc8d1949aaca.slice/crio-fab4add8a8de3107b5180a6d37ca70ebbd574aba09bbb445a4470faa698a6f76 WatchSource:0}: Error finding container fab4add8a8de3107b5180a6d37ca70ebbd574aba09bbb445a4470faa698a6f76: Status 404 returned error can't find the container with id fab4add8a8de3107b5180a6d37ca70ebbd574aba09bbb445a4470faa698a6f76
Apr 16 22:16:36.401444 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.401353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-oauth-config\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.401444 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.401392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-oauth-serving-cert\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.401444 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.401426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-console-config\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.401700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.401455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-serving-cert\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.401700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.401501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-trusted-ca-bundle\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.401700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.401539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-service-ca\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.401700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.401562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cbsr\" (UniqueName: \"kubernetes.io/projected/c79d225c-5a16-4137-b10d-66f4e314492f-kube-api-access-6cbsr\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.402229 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.402196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-oauth-serving-cert\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.402316 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.402209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-service-ca\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.402316 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.402208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-console-config\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.402454 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.402375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-trusted-ca-bundle\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.403988 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.403964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-serving-cert\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.404120 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.404017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-oauth-config\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.411097 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.411077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cbsr\" (UniqueName: \"kubernetes.io/projected/c79d225c-5a16-4137-b10d-66f4e314492f-kube-api-access-6cbsr\") pod \"console-77fd5bfc6c-c6v9b\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.528356 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.528305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77fd5bfc6c-c6v9b"
Apr 16 22:16:36.638876 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:36.638837 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77fd5bfc6c-c6v9b"]
Apr 16 22:16:36.642189 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:16:36.642163 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79d225c_5a16_4137_b10d_66f4e314492f.slice/crio-4448c56803fd44a6d9f04ab2086644af68f426ace56c698947de9566b91b0c37 WatchSource:0}: Error finding container 4448c56803fd44a6d9f04ab2086644af68f426ace56c698947de9566b91b0c37: Status 404 returned error can't find the container with id 4448c56803fd44a6d9f04ab2086644af68f426ace56c698947de9566b91b0c37
Apr 16 22:16:37.080995 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:37.080953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77fd5bfc6c-c6v9b" event={"ID":"c79d225c-5a16-4137-b10d-66f4e314492f","Type":"ContainerStarted","Data":"4448c56803fd44a6d9f04ab2086644af68f426ace56c698947de9566b91b0c37"}
Apr 16 22:16:37.082084 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:37.082055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" event={"ID":"5f4172c2-c04a-49df-b80d-bc8d1949aaca","Type":"ContainerStarted","Data":"fab4add8a8de3107b5180a6d37ca70ebbd574aba09bbb445a4470faa698a6f76"}
Apr 16 22:16:37.812128 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:37.812083 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" podUID="38bedc04-9717-48cb-bbe2-f7b213757480" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/readyz\": dial tcp 10.133.0.8:8000: connect: connection refused"
Apr 16 22:16:38.086183 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:38.086090 2576 generic.go:358] "Generic (PLEG): container finished" podID="38bedc04-9717-48cb-bbe2-f7b213757480" containerID="60998fd019ba650a6a8c3b29b928a34f571d1366a15261e5061078aa77842e11" exitCode=1
Apr 16 22:16:38.086373 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:38.086174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" event={"ID":"38bedc04-9717-48cb-bbe2-f7b213757480","Type":"ContainerDied","Data":"60998fd019ba650a6a8c3b29b928a34f571d1366a15261e5061078aa77842e11"}
Apr 16 22:16:38.086624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:38.086604 2576 scope.go:117] "RemoveContainer" containerID="60998fd019ba650a6a8c3b29b928a34f571d1366a15261e5061078aa77842e11"
Apr 16 22:16:38.088154 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:38.088126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" event={"ID":"5f4172c2-c04a-49df-b80d-bc8d1949aaca","Type":"ContainerStarted","Data":"dc5f2f444f81ed0dbd896e68ec931e38d0f943b515ef33fee29da96994ae3195"}
Apr 16 22:16:38.088288 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:38.088161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" event={"ID":"5f4172c2-c04a-49df-b80d-bc8d1949aaca","Type":"ContainerStarted","Data":"0c350aff280ed0f266195c5fadabda955353f5e1aec92e532c9bd5c1442baaaa"}
Apr 16 22:16:38.120273 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:38.120038 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-s2ssf" podStartSLOduration=3.074359567 podStartE2EDuration="4.12002043s" podCreationTimestamp="2026-04-16 22:16:34 +0000 UTC" firstStartedPulling="2026-04-16 22:16:36.332131541 +0000 UTC m=+175.456755805" lastFinishedPulling="2026-04-16 22:16:37.377792403 +0000 UTC m=+176.502416668" observedRunningTime="2026-04-16 22:16:38.119970685 +0000 UTC m=+177.244594971" watchObservedRunningTime="2026-04-16 22:16:38.12002043 +0000 UTC m=+177.244644721"
Apr 16 22:16:39.094203 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:39.094155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2" event={"ID":"38bedc04-9717-48cb-bbe2-f7b213757480","Type":"ContainerStarted","Data":"e0e4dbcbdc0189740accfdce81d3638f2912f2a062677cbd394a92fc156e46b5"}
Apr 16 22:16:39.094695 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:39.094605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2"
Apr 16 22:16:39.095360 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:39.095316 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d8d748b5b-m2jl2"
Apr 16 22:16:40.070711 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.070623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-774dh"]
Apr 16 22:16:40.074031 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.074009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.080091 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.080065 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 22:16:40.080779 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.080549 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gp7ks\""
Apr 16 22:16:40.080779 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.080771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 22:16:40.080927 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.080819 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 22:16:40.097700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.097670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77fd5bfc6c-c6v9b" event={"ID":"c79d225c-5a16-4137-b10d-66f4e314492f","Type":"ContainerStarted","Data":"400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536"}
Apr 16 22:16:40.137122 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137066 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77fd5bfc6c-c6v9b" podStartSLOduration=1.07202191 podStartE2EDuration="4.137050871s" podCreationTimestamp="2026-04-16 22:16:36 +0000 UTC" firstStartedPulling="2026-04-16 22:16:36.644001685 +0000 UTC m=+175.768625950" lastFinishedPulling="2026-04-16 22:16:39.709030634 +0000 UTC m=+178.833654911" observedRunningTime="2026-04-16 22:16:40.136310387 +0000 UTC m=+179.260934686" watchObservedRunningTime="2026-04-16 22:16:40.137050871 +0000 UTC m=+179.261675157"
Apr 16 22:16:40.137509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-sys\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.137595 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7fa9245e-fc11-4e0b-9c31-017700afca37-metrics-client-ca\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.137595 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.137704 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-wtmp\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.137704 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-tls\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.137786 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-root\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.137786 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82mp\" (UniqueName: \"kubernetes.io/projected/7fa9245e-fc11-4e0b-9c31-017700afca37-kube-api-access-h82mp\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.137786 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-textfile\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.137899 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.137807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-accelerators-collector-config\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.238797 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.238757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-root\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.238998 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.238816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h82mp\" (UniqueName: \"kubernetes.io/projected/7fa9245e-fc11-4e0b-9c31-017700afca37-kube-api-access-h82mp\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.238998 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.238892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-root\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.238998 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.238943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-textfile\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239151 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-accelerators-collector-config\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239202 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-sys\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239254 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-sys\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239304 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-textfile\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239393 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7fa9245e-fc11-4e0b-9c31-017700afca37-metrics-client-ca\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239393 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239534 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-wtmp\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239661 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-tls\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239737 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:40.239722 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 22:16:40.239737 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-accelerators-collector-config\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.239840 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:40.239778 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-tls podName:7fa9245e-fc11-4e0b-9c31-017700afca37 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:40.739760974 +0000 UTC m=+179.864385254 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-tls") pod "node-exporter-774dh" (UID: "7fa9245e-fc11-4e0b-9c31-017700afca37") : secret "node-exporter-tls" not found
Apr 16 22:16:40.239840 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.239781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-wtmp\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.240364 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.240343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7fa9245e-fc11-4e0b-9c31-017700afca37-metrics-client-ca\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.241834 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.241817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.251362 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.251319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82mp\" (UniqueName: \"kubernetes.io/projected/7fa9245e-fc11-4e0b-9c31-017700afca37-kube-api-access-h82mp\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.743104 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.743069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-tls\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.745367 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.745339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7fa9245e-fc11-4e0b-9c31-017700afca37-node-exporter-tls\") pod \"node-exporter-774dh\" (UID: \"7fa9245e-fc11-4e0b-9c31-017700afca37\") " pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.983901 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:40.983863 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-774dh"
Apr 16 22:16:40.991722 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:16:40.991693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa9245e_fc11_4e0b_9c31_017700afca37.slice/crio-d39bd46e716dc003f795ab19891131e5f486b7d754a49117c467fbd1436f1fa1 WatchSource:0}: Error finding container d39bd46e716dc003f795ab19891131e5f486b7d754a49117c467fbd1436f1fa1: Status 404 returned error can't find the container with id d39bd46e716dc003f795ab19891131e5f486b7d754a49117c467fbd1436f1fa1
Apr 16 22:16:41.100921 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:41.100878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-774dh" event={"ID":"7fa9245e-fc11-4e0b-9c31-017700afca37","Type":"ContainerStarted","Data":"d39bd46e716dc003f795ab19891131e5f486b7d754a49117c467fbd1436f1fa1"}
Apr 16 22:16:42.105352 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:42.105245 2576 generic.go:358] "Generic (PLEG): container finished" podID="7fa9245e-fc11-4e0b-9c31-017700afca37" containerID="523348d82f8b761ec5c465b7e81a82ac2161902ee20568ca699b2a0ede09563e" exitCode=0
Apr 16 22:16:42.105352 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:42.105314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-774dh" event={"ID":"7fa9245e-fc11-4e0b-9c31-017700afca37","Type":"ContainerDied","Data":"523348d82f8b761ec5c465b7e81a82ac2161902ee20568ca699b2a0ede09563e"}
Apr 16 22:16:43.109955 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:43.109922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-774dh" event={"ID":"7fa9245e-fc11-4e0b-9c31-017700afca37","Type":"ContainerStarted","Data":"7b80b149edb37e3768dbc00267ddbc26b36cc008d16cf61bf3725af2787e7ae7"}
Apr 16 22:16:43.109955 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:43.109957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-774dh" event={"ID":"7fa9245e-fc11-4e0b-9c31-017700afca37","Type":"ContainerStarted","Data":"eb2ea527a5720880135de8004e4ca0542c61b10779acfdf59eb8881169968b1c"}
Apr 16 22:16:43.133574 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:43.133523 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-774dh" podStartSLOduration=2.42241011 podStartE2EDuration="3.133508511s" podCreationTimestamp="2026-04-16 22:16:40 +0000 UTC" firstStartedPulling="2026-04-16 22:16:40.993281652 +0000 UTC m=+180.117905917" lastFinishedPulling="2026-04-16 22:16:41.704380049 +0000 UTC m=+180.829004318" observedRunningTime="2026-04-16 22:16:43.131935745 +0000 UTC m=+182.256560031" watchObservedRunningTime="2026-04-16 22:16:43.133508511 +0000 UTC m=+182.258132798"
Apr 16 22:16:44.673988 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:44.673954 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77fd5bfc6c-c6v9b"]
Apr 16 22:16:45.335845 ip-10-0-133-183 kubenswrapper[2576]: I0416
22:16:45.335813 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" Apr 16 22:16:46.529313 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:46.529274 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77fd5bfc6c-c6v9b" Apr 16 22:16:50.350273 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.350234 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" podUID="e51ed12b-ab5f-4580-b228-238d2b5b2534" containerName="registry" containerID="cri-o://01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3" gracePeriod=30 Apr 16 22:16:50.584236 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.584207 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" Apr 16 22:16:50.724439 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.724401 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-image-registry-private-configuration\") pod \"e51ed12b-ab5f-4580-b228-238d2b5b2534\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " Apr 16 22:16:50.724439 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.724452 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") pod \"e51ed12b-ab5f-4580-b228-238d2b5b2534\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " Apr 16 22:16:50.724678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.724473 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-certificates\") pod \"e51ed12b-ab5f-4580-b228-238d2b5b2534\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " Apr 16 22:16:50.724678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.724502 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnwkd\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-kube-api-access-fnwkd\") pod \"e51ed12b-ab5f-4580-b228-238d2b5b2534\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " Apr 16 22:16:50.724678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.724525 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-bound-sa-token\") pod \"e51ed12b-ab5f-4580-b228-238d2b5b2534\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " Apr 16 22:16:50.724678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.724540 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e51ed12b-ab5f-4580-b228-238d2b5b2534-ca-trust-extracted\") pod \"e51ed12b-ab5f-4580-b228-238d2b5b2534\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " Apr 16 22:16:50.724678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.724578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-installation-pull-secrets\") pod \"e51ed12b-ab5f-4580-b228-238d2b5b2534\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " Apr 16 22:16:50.724678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.724606 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-trusted-ca\") pod 
\"e51ed12b-ab5f-4580-b228-238d2b5b2534\" (UID: \"e51ed12b-ab5f-4580-b228-238d2b5b2534\") " Apr 16 22:16:50.725232 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.725174 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e51ed12b-ab5f-4580-b228-238d2b5b2534" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:50.725377 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.725275 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e51ed12b-ab5f-4580-b228-238d2b5b2534" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:50.727385 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.727311 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e51ed12b-ab5f-4580-b228-238d2b5b2534" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:50.727489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.727415 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e51ed12b-ab5f-4580-b228-238d2b5b2534" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:50.727489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.727434 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e51ed12b-ab5f-4580-b228-238d2b5b2534" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:50.727489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.727445 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-kube-api-access-fnwkd" (OuterVolumeSpecName: "kube-api-access-fnwkd") pod "e51ed12b-ab5f-4580-b228-238d2b5b2534" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534"). InnerVolumeSpecName "kube-api-access-fnwkd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:50.727604 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.727578 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e51ed12b-ab5f-4580-b228-238d2b5b2534" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:50.733377 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.733350 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e51ed12b-ab5f-4580-b228-238d2b5b2534-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e51ed12b-ab5f-4580-b228-238d2b5b2534" (UID: "e51ed12b-ab5f-4580-b228-238d2b5b2534"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:16:50.825695 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.825646 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-image-registry-private-configuration\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:16:50.825695 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.825681 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:16:50.825695 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.825693 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-registry-certificates\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:16:50.825695 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.825704 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnwkd\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-kube-api-access-fnwkd\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:16:50.826049 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.825713 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e51ed12b-ab5f-4580-b228-238d2b5b2534-bound-sa-token\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:16:50.826049 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.825722 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e51ed12b-ab5f-4580-b228-238d2b5b2534-ca-trust-extracted\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath 
\"\"" Apr 16 22:16:50.826049 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.825729 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e51ed12b-ab5f-4580-b228-238d2b5b2534-installation-pull-secrets\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:16:50.826049 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:50.825738 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e51ed12b-ab5f-4580-b228-238d2b5b2534-trusted-ca\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.131368 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.131270 2576 generic.go:358] "Generic (PLEG): container finished" podID="e51ed12b-ab5f-4580-b228-238d2b5b2534" containerID="01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3" exitCode=0 Apr 16 22:16:51.131368 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.131353 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" Apr 16 22:16:51.131553 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.131360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" event={"ID":"e51ed12b-ab5f-4580-b228-238d2b5b2534","Type":"ContainerDied","Data":"01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3"} Apr 16 22:16:51.131553 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.131401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66f768b8d9-tgggs" event={"ID":"e51ed12b-ab5f-4580-b228-238d2b5b2534","Type":"ContainerDied","Data":"cee9c794d1a8a79c0cfebf4653c5286db17295d28ed4a248e8dddaa7487e728b"} Apr 16 22:16:51.131553 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.131417 2576 scope.go:117] "RemoveContainer" containerID="01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3" Apr 16 22:16:51.143648 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.143620 2576 scope.go:117] "RemoveContainer" containerID="01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3" Apr 16 22:16:51.143984 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:16:51.143964 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3\": container with ID starting with 01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3 not found: ID does not exist" containerID="01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3" Apr 16 22:16:51.144040 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.143993 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3"} err="failed to get container status 
\"01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3\": rpc error: code = NotFound desc = could not find container \"01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3\": container with ID starting with 01101a7a641e6856f4269f8710cd9d48e3bf8f4a2e646cbc6fec0bc2a5b252d3 not found: ID does not exist" Apr 16 22:16:51.158628 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.158602 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66f768b8d9-tgggs"] Apr 16 22:16:51.166893 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.166871 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66f768b8d9-tgggs"] Apr 16 22:16:51.452607 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:16:51.452575 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51ed12b-ab5f-4580-b228-238d2b5b2534" path="/var/lib/kubelet/pods/e51ed12b-ab5f-4580-b228-238d2b5b2534/volumes" Apr 16 22:17:03.164690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:03.164657 2576 generic.go:358] "Generic (PLEG): container finished" podID="c83c75a9-9ffe-4644-b733-f725231b1b4b" containerID="724b6f174af8fa0bb530703bf44035784345e50b5ff56f696548b7f10a467a57" exitCode=0 Apr 16 22:17:03.165089 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:03.164724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" event={"ID":"c83c75a9-9ffe-4644-b733-f725231b1b4b","Type":"ContainerDied","Data":"724b6f174af8fa0bb530703bf44035784345e50b5ff56f696548b7f10a467a57"} Apr 16 22:17:03.165089 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:03.165039 2576 scope.go:117] "RemoveContainer" containerID="724b6f174af8fa0bb530703bf44035784345e50b5ff56f696548b7f10a467a57" Apr 16 22:17:04.168623 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:04.168586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-5ddtz" event={"ID":"c83c75a9-9ffe-4644-b733-f725231b1b4b","Type":"ContainerStarted","Data":"89cde9f692a684dd73a1630965bf98cc147e35ba38ad7a21741912713e3d3482"} Apr 16 22:17:09.692350 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.692280 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77fd5bfc6c-c6v9b" podUID="c79d225c-5a16-4137-b10d-66f4e314492f" containerName="console" containerID="cri-o://400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536" gracePeriod=15 Apr 16 22:17:09.927500 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.927480 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77fd5bfc6c-c6v9b_c79d225c-5a16-4137-b10d-66f4e314492f/console/0.log" Apr 16 22:17:09.927602 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.927537 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77fd5bfc6c-c6v9b" Apr 16 22:17:09.979314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979247 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cbsr\" (UniqueName: \"kubernetes.io/projected/c79d225c-5a16-4137-b10d-66f4e314492f-kube-api-access-6cbsr\") pod \"c79d225c-5a16-4137-b10d-66f4e314492f\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " Apr 16 22:17:09.979314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979287 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-service-ca\") pod \"c79d225c-5a16-4137-b10d-66f4e314492f\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " Apr 16 22:17:09.979314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979305 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-serving-cert\") pod \"c79d225c-5a16-4137-b10d-66f4e314492f\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " Apr 16 22:17:09.979568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979355 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-console-config\") pod \"c79d225c-5a16-4137-b10d-66f4e314492f\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " Apr 16 22:17:09.979568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979370 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-trusted-ca-bundle\") pod \"c79d225c-5a16-4137-b10d-66f4e314492f\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " Apr 16 22:17:09.979568 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:17:09.979399 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-oauth-config\") pod \"c79d225c-5a16-4137-b10d-66f4e314492f\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " Apr 16 22:17:09.979568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979415 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-oauth-serving-cert\") pod \"c79d225c-5a16-4137-b10d-66f4e314492f\" (UID: \"c79d225c-5a16-4137-b10d-66f4e314492f\") " Apr 16 22:17:09.979771 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979718 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-service-ca" (OuterVolumeSpecName: "service-ca") pod "c79d225c-5a16-4137-b10d-66f4e314492f" (UID: "c79d225c-5a16-4137-b10d-66f4e314492f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:09.979866 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-console-config" (OuterVolumeSpecName: "console-config") pod "c79d225c-5a16-4137-b10d-66f4e314492f" (UID: "c79d225c-5a16-4137-b10d-66f4e314492f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:09.979921 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c79d225c-5a16-4137-b10d-66f4e314492f" (UID: "c79d225c-5a16-4137-b10d-66f4e314492f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:09.979962 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.979941 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c79d225c-5a16-4137-b10d-66f4e314492f" (UID: "c79d225c-5a16-4137-b10d-66f4e314492f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:09.981689 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.981659 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c79d225c-5a16-4137-b10d-66f4e314492f" (UID: "c79d225c-5a16-4137-b10d-66f4e314492f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:09.981860 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.981833 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c79d225c-5a16-4137-b10d-66f4e314492f" (UID: "c79d225c-5a16-4137-b10d-66f4e314492f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:09.981950 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:09.981931 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79d225c-5a16-4137-b10d-66f4e314492f-kube-api-access-6cbsr" (OuterVolumeSpecName: "kube-api-access-6cbsr") pod "c79d225c-5a16-4137-b10d-66f4e314492f" (UID: "c79d225c-5a16-4137-b10d-66f4e314492f"). InnerVolumeSpecName "kube-api-access-6cbsr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:10.080290 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.080252 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-serving-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:17:10.080290 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.080289 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-console-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:17:10.080503 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.080298 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-trusted-ca-bundle\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:17:10.080503 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.080307 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c79d225c-5a16-4137-b10d-66f4e314492f-console-oauth-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:17:10.080503 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.080316 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-oauth-serving-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:17:10.080503 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.080345 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cbsr\" (UniqueName: \"kubernetes.io/projected/c79d225c-5a16-4137-b10d-66f4e314492f-kube-api-access-6cbsr\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:17:10.080503 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.080354 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c79d225c-5a16-4137-b10d-66f4e314492f-service-ca\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:17:10.185062 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.185035 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77fd5bfc6c-c6v9b_c79d225c-5a16-4137-b10d-66f4e314492f/console/0.log" Apr 16 22:17:10.185217 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.185070 2576 generic.go:358] "Generic (PLEG): container finished" podID="c79d225c-5a16-4137-b10d-66f4e314492f" containerID="400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536" exitCode=2 Apr 16 22:17:10.185217 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.185107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77fd5bfc6c-c6v9b" event={"ID":"c79d225c-5a16-4137-b10d-66f4e314492f","Type":"ContainerDied","Data":"400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536"} Apr 16 22:17:10.185217 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.185131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77fd5bfc6c-c6v9b" event={"ID":"c79d225c-5a16-4137-b10d-66f4e314492f","Type":"ContainerDied","Data":"4448c56803fd44a6d9f04ab2086644af68f426ace56c698947de9566b91b0c37"} Apr 16 22:17:10.185217 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:17:10.185134 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77fd5bfc6c-c6v9b" Apr 16 22:17:10.185217 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.185152 2576 scope.go:117] "RemoveContainer" containerID="400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536" Apr 16 22:17:10.195536 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.195512 2576 scope.go:117] "RemoveContainer" containerID="400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536" Apr 16 22:17:10.195793 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:17:10.195774 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536\": container with ID starting with 400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536 not found: ID does not exist" containerID="400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536" Apr 16 22:17:10.195859 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.195801 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536"} err="failed to get container status \"400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536\": rpc error: code = NotFound desc = could not find container \"400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536\": container with ID starting with 400a4e5d85fea8eb1ef4834495c134878907d329bffa4353428eea37600d6536 not found: ID does not exist" Apr 16 22:17:10.211114 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.211084 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77fd5bfc6c-c6v9b"] Apr 16 22:17:10.214660 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:10.214641 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-77fd5bfc6c-c6v9b"] Apr 16 22:17:11.451956 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:11.451927 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79d225c-5a16-4137-b10d-66f4e314492f" path="/var/lib/kubelet/pods/c79d225c-5a16-4137-b10d-66f4e314492f/volumes" Apr 16 22:17:52.191915 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:52.191877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:17:52.194282 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:52.194256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3eaddc5-e6c1-45aa-a952-0c7d74359e05-metrics-certs\") pod \"network-metrics-daemon-6dklm\" (UID: \"c3eaddc5-e6c1-45aa-a952-0c7d74359e05\") " pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:17:52.252378 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:52.252317 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gj5qm\"" Apr 16 22:17:52.260233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:52.260212 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dklm" Apr 16 22:17:52.377203 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:52.377171 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6dklm"] Apr 16 22:17:52.380362 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:17:52.380332 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3eaddc5_e6c1_45aa_a952_0c7d74359e05.slice/crio-8431e301ab0e57e50bf53b448fc2d7e06b1d2ffad1db93316fe2f94cda4d999e WatchSource:0}: Error finding container 8431e301ab0e57e50bf53b448fc2d7e06b1d2ffad1db93316fe2f94cda4d999e: Status 404 returned error can't find the container with id 8431e301ab0e57e50bf53b448fc2d7e06b1d2ffad1db93316fe2f94cda4d999e Apr 16 22:17:53.302618 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:53.302581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6dklm" event={"ID":"c3eaddc5-e6c1-45aa-a952-0c7d74359e05","Type":"ContainerStarted","Data":"8431e301ab0e57e50bf53b448fc2d7e06b1d2ffad1db93316fe2f94cda4d999e"} Apr 16 22:17:54.306863 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:54.306826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6dklm" event={"ID":"c3eaddc5-e6c1-45aa-a952-0c7d74359e05","Type":"ContainerStarted","Data":"6d40ff289503721debab7cc0a18009cd3ec4f601de39ece92f5f81f0b2c380cb"} Apr 16 22:17:54.306863 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:54.306864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6dklm" event={"ID":"c3eaddc5-e6c1-45aa-a952-0c7d74359e05","Type":"ContainerStarted","Data":"49f3b1a94b13f4ebf6859520ff7950654b0c821c62adbf0bf39e6ed7cce1d1bd"} Apr 16 22:17:54.323179 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:17:54.323133 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-6dklm" podStartSLOduration=252.396861852 podStartE2EDuration="4m13.323118602s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:17:52.382041669 +0000 UTC m=+251.506665933" lastFinishedPulling="2026-04-16 22:17:53.308298402 +0000 UTC m=+252.432922683" observedRunningTime="2026-04-16 22:17:54.3222975 +0000 UTC m=+253.446921799" watchObservedRunningTime="2026-04-16 22:17:54.323118602 +0000 UTC m=+253.447742889" Apr 16 22:18:07.839172 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.839095 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84c7fcd89f-g764g"] Apr 16 22:18:07.839519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.839386 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e51ed12b-ab5f-4580-b228-238d2b5b2534" containerName="registry" Apr 16 22:18:07.839519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.839399 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51ed12b-ab5f-4580-b228-238d2b5b2534" containerName="registry" Apr 16 22:18:07.839519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.839422 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c79d225c-5a16-4137-b10d-66f4e314492f" containerName="console" Apr 16 22:18:07.839519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.839428 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79d225c-5a16-4137-b10d-66f4e314492f" containerName="console" Apr 16 22:18:07.839519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.839470 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e51ed12b-ab5f-4580-b228-238d2b5b2534" containerName="registry" Apr 16 22:18:07.839519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.839480 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c79d225c-5a16-4137-b10d-66f4e314492f" containerName="console" Apr 16 
22:18:07.843315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.843296 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:07.850653 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.850633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:18:07.850947 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.850926 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:18:07.851879 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.851852 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:18:07.851879 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.851860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:18:07.851879 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.851877 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-v9r8c\"" Apr 16 22:18:07.852098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.851859 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:18:07.852098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.851854 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:18:07.852098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.851861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:18:07.856410 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.856387 2576 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:18:07.874405 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.874383 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84c7fcd89f-g764g"] Apr 16 22:18:07.912851 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.912827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-service-ca\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:07.912968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.912855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-oauth-serving-cert\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:07.912968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.912882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-console-config\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:07.912968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.912942 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-trusted-ca-bundle\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 
22:18:07.913070 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.913010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-serving-cert\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:07.913070 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.913033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsqn9\" (UniqueName: \"kubernetes.io/projected/b28f0459-a79b-40d0-ac6f-949df9829684-kube-api-access-qsqn9\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:07.913133 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:07.913073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-oauth-config\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.014095 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-serving-cert\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.014249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsqn9\" (UniqueName: 
\"kubernetes.io/projected/b28f0459-a79b-40d0-ac6f-949df9829684-kube-api-access-qsqn9\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.014249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-oauth-config\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.014249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-service-ca\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.014450 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-oauth-serving-cert\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.014450 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-console-config\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.014450 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014444 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-trusted-ca-bundle\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.014916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-service-ca\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.015027 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.014988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-oauth-serving-cert\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.015027 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.015016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-console-config\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.015392 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.015372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-trusted-ca-bundle\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.016622 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.016586 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-serving-cert\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.016810 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.016792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-oauth-config\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.022837 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.022815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsqn9\" (UniqueName: \"kubernetes.io/projected/b28f0459-a79b-40d0-ac6f-949df9829684-kube-api-access-qsqn9\") pod \"console-84c7fcd89f-g764g\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") " pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.152676 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.152612 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:08.269273 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.269113 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84c7fcd89f-g764g"] Apr 16 22:18:08.271727 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:18:08.271690 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb28f0459_a79b_40d0_ac6f_949df9829684.slice/crio-a1cacd11a257b0eb614c9e18a6a3891cb72e46fd525c825ee97095de5ff4a0ec WatchSource:0}: Error finding container a1cacd11a257b0eb614c9e18a6a3891cb72e46fd525c825ee97095de5ff4a0ec: Status 404 returned error can't find the container with id a1cacd11a257b0eb614c9e18a6a3891cb72e46fd525c825ee97095de5ff4a0ec Apr 16 22:18:08.344679 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.344643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84c7fcd89f-g764g" event={"ID":"b28f0459-a79b-40d0-ac6f-949df9829684","Type":"ContainerStarted","Data":"af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679"} Apr 16 22:18:08.344679 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.344679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84c7fcd89f-g764g" event={"ID":"b28f0459-a79b-40d0-ac6f-949df9829684","Type":"ContainerStarted","Data":"a1cacd11a257b0eb614c9e18a6a3891cb72e46fd525c825ee97095de5ff4a0ec"} Apr 16 22:18:08.362061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:08.361905 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84c7fcd89f-g764g" podStartSLOduration=1.361890968 podStartE2EDuration="1.361890968s" podCreationTimestamp="2026-04-16 22:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:18:08.36034371 +0000 UTC 
m=+267.484967989" watchObservedRunningTime="2026-04-16 22:18:08.361890968 +0000 UTC m=+267.486515256" Apr 16 22:18:17.418097 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.418058 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84c7fcd89f-g764g"] Apr 16 22:18:17.453071 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.453044 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67b4f68575-44mc5"] Apr 16 22:18:17.456395 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.456377 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.463755 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.463730 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67b4f68575-44mc5"] Apr 16 22:18:17.580644 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.580611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-service-ca\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.580825 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.580654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-oauth-config\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.580825 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.580685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-config\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.580825 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.580746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6mjg\" (UniqueName: \"kubernetes.io/projected/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-kube-api-access-x6mjg\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.580825 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.580783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-trusted-ca-bundle\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.580975 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.580846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-oauth-serving-cert\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.580975 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.580880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-serving-cert\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.681466 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:18:17.681386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-serving-cert\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.681466 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.681429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-service-ca\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.681648 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.681492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-oauth-config\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.681648 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.681529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-config\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.681648 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.681569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6mjg\" (UniqueName: \"kubernetes.io/projected/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-kube-api-access-x6mjg\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " 
pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.681648 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.681603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-trusted-ca-bundle\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.681648 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.681633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-oauth-serving-cert\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.682194 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.682165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-service-ca\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.682352 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.682203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-config\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.682352 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.682282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-oauth-serving-cert\") pod \"console-67b4f68575-44mc5\" (UID: 
\"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.682930 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.682907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-trusted-ca-bundle\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.684069 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.684037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-oauth-config\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.684197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.684151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-serving-cert\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.689361 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.689340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6mjg\" (UniqueName: \"kubernetes.io/projected/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-kube-api-access-x6mjg\") pod \"console-67b4f68575-44mc5\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.766719 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.766687 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:18:17.881521 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:17.881490 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67b4f68575-44mc5"] Apr 16 22:18:17.884398 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:18:17.884346 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod984d97bd_2515_4b90_9eb2_cad3c0ac81f0.slice/crio-3263d199fc4a609a19ddaa043fb9a6436f3b7f3bf697c91e84b9d7db0604d1fb WatchSource:0}: Error finding container 3263d199fc4a609a19ddaa043fb9a6436f3b7f3bf697c91e84b9d7db0604d1fb: Status 404 returned error can't find the container with id 3263d199fc4a609a19ddaa043fb9a6436f3b7f3bf697c91e84b9d7db0604d1fb Apr 16 22:18:18.152841 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:18.152795 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84c7fcd89f-g764g" Apr 16 22:18:18.373266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:18.373183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67b4f68575-44mc5" event={"ID":"984d97bd-2515-4b90-9eb2-cad3c0ac81f0","Type":"ContainerStarted","Data":"169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7"} Apr 16 22:18:18.373266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:18.373220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67b4f68575-44mc5" event={"ID":"984d97bd-2515-4b90-9eb2-cad3c0ac81f0","Type":"ContainerStarted","Data":"3263d199fc4a609a19ddaa043fb9a6436f3b7f3bf697c91e84b9d7db0604d1fb"} Apr 16 22:18:18.391341 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:18.391276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67b4f68575-44mc5" podStartSLOduration=1.3912633269999999 podStartE2EDuration="1.391263327s" 
podCreationTimestamp="2026-04-16 22:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:18:18.390726087 +0000 UTC m=+277.515350373" watchObservedRunningTime="2026-04-16 22:18:18.391263327 +0000 UTC m=+277.515887614"
Apr 16 22:18:27.767491 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:27.767458 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67b4f68575-44mc5"
Apr 16 22:18:27.767491 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:27.767492 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67b4f68575-44mc5"
Apr 16 22:18:27.772271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:27.772245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67b4f68575-44mc5"
Apr 16 22:18:28.404973 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:28.404947 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67b4f68575-44mc5"
Apr 16 22:18:41.333285 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:41.333255 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 22:18:42.436418 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.436377 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84c7fcd89f-g764g" podUID="b28f0459-a79b-40d0-ac6f-949df9829684" containerName="console" containerID="cri-o://af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679" gracePeriod=15
Apr 16 22:18:42.688819 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.688797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84c7fcd89f-g764g_b28f0459-a79b-40d0-ac6f-949df9829684/console/0.log"
Apr 16 22:18:42.688935 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.688854 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84c7fcd89f-g764g"
Apr 16 22:18:42.769530 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.769500 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-oauth-config\") pod \"b28f0459-a79b-40d0-ac6f-949df9829684\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") "
Apr 16 22:18:42.769672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.769550 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-trusted-ca-bundle\") pod \"b28f0459-a79b-40d0-ac6f-949df9829684\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") "
Apr 16 22:18:42.769672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.769580 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-service-ca\") pod \"b28f0459-a79b-40d0-ac6f-949df9829684\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") "
Apr 16 22:18:42.769672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.769617 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-oauth-serving-cert\") pod \"b28f0459-a79b-40d0-ac6f-949df9829684\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") "
Apr 16 22:18:42.769672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.769641 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsqn9\" (UniqueName: \"kubernetes.io/projected/b28f0459-a79b-40d0-ac6f-949df9829684-kube-api-access-qsqn9\") pod \"b28f0459-a79b-40d0-ac6f-949df9829684\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") "
Apr 16 22:18:42.769672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.769669 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-console-config\") pod \"b28f0459-a79b-40d0-ac6f-949df9829684\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") "
Apr 16 22:18:42.769922 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.769695 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-serving-cert\") pod \"b28f0459-a79b-40d0-ac6f-949df9829684\" (UID: \"b28f0459-a79b-40d0-ac6f-949df9829684\") "
Apr 16 22:18:42.770059 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.770035 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-service-ca" (OuterVolumeSpecName: "service-ca") pod "b28f0459-a79b-40d0-ac6f-949df9829684" (UID: "b28f0459-a79b-40d0-ac6f-949df9829684"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:42.770121 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.770068 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-console-config" (OuterVolumeSpecName: "console-config") pod "b28f0459-a79b-40d0-ac6f-949df9829684" (UID: "b28f0459-a79b-40d0-ac6f-949df9829684"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:42.770158 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.770107 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b28f0459-a79b-40d0-ac6f-949df9829684" (UID: "b28f0459-a79b-40d0-ac6f-949df9829684"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:42.770158 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.770114 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b28f0459-a79b-40d0-ac6f-949df9829684" (UID: "b28f0459-a79b-40d0-ac6f-949df9829684"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:42.771638 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.771610 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b28f0459-a79b-40d0-ac6f-949df9829684" (UID: "b28f0459-a79b-40d0-ac6f-949df9829684"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:18:42.771752 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.771731 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28f0459-a79b-40d0-ac6f-949df9829684-kube-api-access-qsqn9" (OuterVolumeSpecName: "kube-api-access-qsqn9") pod "b28f0459-a79b-40d0-ac6f-949df9829684" (UID: "b28f0459-a79b-40d0-ac6f-949df9829684"). InnerVolumeSpecName "kube-api-access-qsqn9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:18:42.771798 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.771759 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b28f0459-a79b-40d0-ac6f-949df9829684" (UID: "b28f0459-a79b-40d0-ac6f-949df9829684"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:18:42.870361 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.870305 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-trusted-ca-bundle\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:18:42.870361 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.870357 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-service-ca\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:18:42.870361 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.870368 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-oauth-serving-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:18:42.870551 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.870377 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qsqn9\" (UniqueName: \"kubernetes.io/projected/b28f0459-a79b-40d0-ac6f-949df9829684-kube-api-access-qsqn9\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:18:42.870551 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.870388 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b28f0459-a79b-40d0-ac6f-949df9829684-console-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:18:42.870551 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.870396 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-serving-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:18:42.870551 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:42.870404 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b28f0459-a79b-40d0-ac6f-949df9829684-console-oauth-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:18:43.439770 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.439743 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84c7fcd89f-g764g_b28f0459-a79b-40d0-ac6f-949df9829684/console/0.log"
Apr 16 22:18:43.440173 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.439787 2576 generic.go:358] "Generic (PLEG): container finished" podID="b28f0459-a79b-40d0-ac6f-949df9829684" containerID="af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679" exitCode=2
Apr 16 22:18:43.440173 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.439862 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84c7fcd89f-g764g"
Apr 16 22:18:43.440173 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.439882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84c7fcd89f-g764g" event={"ID":"b28f0459-a79b-40d0-ac6f-949df9829684","Type":"ContainerDied","Data":"af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679"}
Apr 16 22:18:43.440173 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.439931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84c7fcd89f-g764g" event={"ID":"b28f0459-a79b-40d0-ac6f-949df9829684","Type":"ContainerDied","Data":"a1cacd11a257b0eb614c9e18a6a3891cb72e46fd525c825ee97095de5ff4a0ec"}
Apr 16 22:18:43.440173 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.439949 2576 scope.go:117] "RemoveContainer" containerID="af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679"
Apr 16 22:18:43.448188 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.448168 2576 scope.go:117] "RemoveContainer" containerID="af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679"
Apr 16 22:18:43.448450 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:18:43.448427 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679\": container with ID starting with af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679 not found: ID does not exist" containerID="af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679"
Apr 16 22:18:43.448508 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.448457 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679"} err="failed to get container status \"af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679\": rpc error: code = NotFound desc = could not find container \"af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679\": container with ID starting with af5a76d554b215d63c50e605bca96d93b60c1cf157ba8bacb2d3292b95913679 not found: ID does not exist"
Apr 16 22:18:43.460031 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.460007 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84c7fcd89f-g764g"]
Apr 16 22:18:43.466104 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:43.466082 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84c7fcd89f-g764g"]
Apr 16 22:18:45.452255 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:18:45.452223 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28f0459-a79b-40d0-ac6f-949df9829684" path="/var/lib/kubelet/pods/b28f0459-a79b-40d0-ac6f-949df9829684/volumes"
Apr 16 22:19:18.868984 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.868949 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"]
Apr 16 22:19:18.869519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.869317 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b28f0459-a79b-40d0-ac6f-949df9829684" containerName="console"
Apr 16 22:19:18.869519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.869352 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28f0459-a79b-40d0-ac6f-949df9829684" containerName="console"
Apr 16 22:19:18.869519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.869429 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b28f0459-a79b-40d0-ac6f-949df9829684" containerName="console"
Apr 16 22:19:18.872318 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.872297 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:18.875235 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.875214 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 22:19:18.875414 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.875396 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 22:19:18.875528 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.875508 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-tl29n\""
Apr 16 22:19:18.875602 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.875517 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 22:19:18.886350 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:18.886306 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"]
Apr 16 22:19:19.038182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.038149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2f0d04b6-1f1d-496a-9310-1e93fe3579d8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7695z\" (UID: \"2f0d04b6-1f1d-496a-9310-1e93fe3579d8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:19.038366 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.038195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl9p5\" (UniqueName: \"kubernetes.io/projected/2f0d04b6-1f1d-496a-9310-1e93fe3579d8-kube-api-access-xl9p5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7695z\" (UID: \"2f0d04b6-1f1d-496a-9310-1e93fe3579d8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:19.139397 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.139286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl9p5\" (UniqueName: \"kubernetes.io/projected/2f0d04b6-1f1d-496a-9310-1e93fe3579d8-kube-api-access-xl9p5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7695z\" (UID: \"2f0d04b6-1f1d-496a-9310-1e93fe3579d8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:19.139397 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.139377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2f0d04b6-1f1d-496a-9310-1e93fe3579d8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7695z\" (UID: \"2f0d04b6-1f1d-496a-9310-1e93fe3579d8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:19.141636 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.141607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2f0d04b6-1f1d-496a-9310-1e93fe3579d8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7695z\" (UID: \"2f0d04b6-1f1d-496a-9310-1e93fe3579d8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:19.147381 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.147355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl9p5\" (UniqueName: \"kubernetes.io/projected/2f0d04b6-1f1d-496a-9310-1e93fe3579d8-kube-api-access-xl9p5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7695z\" (UID: \"2f0d04b6-1f1d-496a-9310-1e93fe3579d8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:19.181758 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.181732 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:19.300573 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.300444 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"]
Apr 16 22:19:19.303357 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:19:19.303312 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f0d04b6_1f1d_496a_9310_1e93fe3579d8.slice/crio-a1f1d89ec26a8e73fd8345a0dd35e277f7e1a33335fdf7a609ebc7a379eff3f3 WatchSource:0}: Error finding container a1f1d89ec26a8e73fd8345a0dd35e277f7e1a33335fdf7a609ebc7a379eff3f3: Status 404 returned error can't find the container with id a1f1d89ec26a8e73fd8345a0dd35e277f7e1a33335fdf7a609ebc7a379eff3f3
Apr 16 22:19:19.304995 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.304979 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:19:19.535094 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:19.535058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z" event={"ID":"2f0d04b6-1f1d-496a-9310-1e93fe3579d8","Type":"ContainerStarted","Data":"a1f1d89ec26a8e73fd8345a0dd35e277f7e1a33335fdf7a609ebc7a379eff3f3"}
Apr 16 22:19:23.108536 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.108498 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5sw96"]
Apr 16 22:19:23.111690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.111665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.114173 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.114152 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 22:19:23.114491 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.114472 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 22:19:23.114571 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.114555 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rkcdv\""
Apr 16 22:19:23.121682 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.121658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5sw96"]
Apr 16 22:19:23.273698 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.273660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm2qk\" (UniqueName: \"kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-kube-api-access-wm2qk\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.273946 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.273715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-cabundle0\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.273946 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.273740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.374581 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.374508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wm2qk\" (UniqueName: \"kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-kube-api-access-wm2qk\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.374581 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.374550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-cabundle0\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.374581 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.374573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.374789 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:19:23.374697 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 22:19:23.374789 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:19:23.374713 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 22:19:23.374789 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:19:23.374724 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5sw96: references non-existent secret key: ca.crt
Apr 16 22:19:23.374789 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:19:23.374789 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates podName:bf05b65a-d89f-4ee1-bbf3-df385bb355aa nodeName:}" failed. No retries permitted until 2026-04-16 22:19:23.874765665 +0000 UTC m=+342.999389931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates") pod "keda-operator-ffbb595cb-5sw96" (UID: "bf05b65a-d89f-4ee1-bbf3-df385bb355aa") : references non-existent secret key: ca.crt
Apr 16 22:19:23.375224 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.375208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-cabundle0\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.394232 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.394199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm2qk\" (UniqueName: \"kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-kube-api-access-wm2qk\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.548033 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.547992 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z" event={"ID":"2f0d04b6-1f1d-496a-9310-1e93fe3579d8","Type":"ContainerStarted","Data":"d8e9d44a185ccc39f2f04eeb11da0654c8b1b8110a52d930fc505a3ae26f247f"}
Apr 16 22:19:23.548224 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.548063 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:23.569319 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.569270 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z" podStartSLOduration=2.30638671 podStartE2EDuration="5.569248765s" podCreationTimestamp="2026-04-16 22:19:18 +0000 UTC" firstStartedPulling="2026-04-16 22:19:19.305097948 +0000 UTC m=+338.429722212" lastFinishedPulling="2026-04-16 22:19:22.567959988 +0000 UTC m=+341.692584267" observedRunningTime="2026-04-16 22:19:23.56792438 +0000 UTC m=+342.692548668" watchObservedRunningTime="2026-04-16 22:19:23.569248765 +0000 UTC m=+342.693873053"
Apr 16 22:19:23.879064 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:23.877885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:23.879064 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:19:23.878056 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 22:19:23.879064 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:19:23.878081 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 22:19:23.879064 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:19:23.878093 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5sw96: references non-existent secret key: ca.crt
Apr 16 22:19:23.879064 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:19:23.878153 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates podName:bf05b65a-d89f-4ee1-bbf3-df385bb355aa nodeName:}" failed. No retries permitted until 2026-04-16 22:19:24.878135859 +0000 UTC m=+344.002760137 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates") pod "keda-operator-ffbb595cb-5sw96" (UID: "bf05b65a-d89f-4ee1-bbf3-df385bb355aa") : references non-existent secret key: ca.crt
Apr 16 22:19:24.886282 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:24.886242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:24.888667 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:24.888646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf05b65a-d89f-4ee1-bbf3-df385bb355aa-certificates\") pod \"keda-operator-ffbb595cb-5sw96\" (UID: \"bf05b65a-d89f-4ee1-bbf3-df385bb355aa\") " pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:24.921666 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:24.921625 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:25.041490 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:25.041358 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5sw96"]
Apr 16 22:19:25.044293 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:19:25.044264 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf05b65a_d89f_4ee1_bbf3_df385bb355aa.slice/crio-9e68de34c2c94b349c30c8f1f2684f8307284d02e5ee504bb29a0ac29eb3d0b4 WatchSource:0}: Error finding container 9e68de34c2c94b349c30c8f1f2684f8307284d02e5ee504bb29a0ac29eb3d0b4: Status 404 returned error can't find the container with id 9e68de34c2c94b349c30c8f1f2684f8307284d02e5ee504bb29a0ac29eb3d0b4
Apr 16 22:19:25.555372 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:25.555317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5sw96" event={"ID":"bf05b65a-d89f-4ee1-bbf3-df385bb355aa","Type":"ContainerStarted","Data":"9e68de34c2c94b349c30c8f1f2684f8307284d02e5ee504bb29a0ac29eb3d0b4"}
Apr 16 22:19:28.566297 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:28.566262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5sw96" event={"ID":"bf05b65a-d89f-4ee1-bbf3-df385bb355aa","Type":"ContainerStarted","Data":"ca2d52cbc1340b6ba9397fed692b65cb76d01d457c3208c520192c7f193531fc"}
Apr 16 22:19:28.566690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:28.566375 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:19:28.582385 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:28.582318 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-5sw96" podStartSLOduration=2.5044548410000003 podStartE2EDuration="5.582305514s" podCreationTimestamp="2026-04-16 22:19:23 +0000 UTC" firstStartedPulling="2026-04-16 22:19:25.045885702 +0000 UTC m=+344.170509967" lastFinishedPulling="2026-04-16 22:19:28.123736375 +0000 UTC m=+347.248360640" observedRunningTime="2026-04-16 22:19:28.581479315 +0000 UTC m=+347.706103614" watchObservedRunningTime="2026-04-16 22:19:28.582305514 +0000 UTC m=+347.706929800"
Apr 16 22:19:44.554125 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:44.554084 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7695z"
Apr 16 22:19:49.570883 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:19:49.570851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-5sw96"
Apr 16 22:20:32.393465 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.393431 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-ppfds"]
Apr 16 22:20:32.396607 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.396587 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds"
Apr 16 22:20:32.399360 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.399312 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 22:20:32.399530 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.399424 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 22:20:32.399530 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.399451 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 22:20:32.400604 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.400587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-jjvqm\""
Apr 16 22:20:32.408850 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.408828 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-ppfds"]
Apr 16 22:20:32.431261 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.431234 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-wnmw4"]
Apr 16 22:20:32.434212 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.434197 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-wnmw4"
Apr 16 22:20:32.437200 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.437180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-btjqh\""
Apr 16 22:20:32.437427 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.437411 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 22:20:32.444554 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.444534 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-wnmw4"]
Apr 16 22:20:32.514042 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.514001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv7hw\" (UniqueName: \"kubernetes.io/projected/bdc26ee9-46e8-464a-9073-1b8f9ddef561-kube-api-access-gv7hw\") pod \"kserve-controller-manager-84d7d5cfc6-ppfds\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds"
Apr 16 22:20:32.514219 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.514052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert\") pod \"kserve-controller-manager-84d7d5cfc6-ppfds\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds"
Apr 16 22:20:32.614930 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.614898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/521f722d-8bc2-4b4d-849a-9eda46c2bcf4-data\") pod \"seaweedfs-86cc847c5c-wnmw4\" (UID: \"521f722d-8bc2-4b4d-849a-9eda46c2bcf4\") " pod="kserve/seaweedfs-86cc847c5c-wnmw4"
Apr 16 22:20:32.615106 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.614967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv7hw\" (UniqueName: \"kubernetes.io/projected/bdc26ee9-46e8-464a-9073-1b8f9ddef561-kube-api-access-gv7hw\") pod \"kserve-controller-manager-84d7d5cfc6-ppfds\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds"
Apr 16 22:20:32.615106 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.614995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert\") pod \"kserve-controller-manager-84d7d5cfc6-ppfds\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds"
Apr 16 22:20:32.615106 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.615027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zbgj\" (UniqueName: \"kubernetes.io/projected/521f722d-8bc2-4b4d-849a-9eda46c2bcf4-kube-api-access-5zbgj\") pod \"seaweedfs-86cc847c5c-wnmw4\" (UID: \"521f722d-8bc2-4b4d-849a-9eda46c2bcf4\") " pod="kserve/seaweedfs-86cc847c5c-wnmw4"
Apr 16 22:20:32.615223 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:20:32.615123 2576 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 16 22:20:32.615223 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:20:32.615180 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert podName:bdc26ee9-46e8-464a-9073-1b8f9ddef561 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:33.115162822 +0000 UTC m=+412.239787089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert") pod "kserve-controller-manager-84d7d5cfc6-ppfds" (UID: "bdc26ee9-46e8-464a-9073-1b8f9ddef561") : secret "kserve-webhook-server-cert" not found
Apr 16 22:20:32.623605 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.623576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv7hw\" (UniqueName: \"kubernetes.io/projected/bdc26ee9-46e8-464a-9073-1b8f9ddef561-kube-api-access-gv7hw\") pod \"kserve-controller-manager-84d7d5cfc6-ppfds\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds"
Apr 16 22:20:32.716422 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.716387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zbgj\" (UniqueName: \"kubernetes.io/projected/521f722d-8bc2-4b4d-849a-9eda46c2bcf4-kube-api-access-5zbgj\") pod \"seaweedfs-86cc847c5c-wnmw4\" (UID: \"521f722d-8bc2-4b4d-849a-9eda46c2bcf4\") " pod="kserve/seaweedfs-86cc847c5c-wnmw4"
Apr 16 22:20:32.716606 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.716470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/521f722d-8bc2-4b4d-849a-9eda46c2bcf4-data\") pod \"seaweedfs-86cc847c5c-wnmw4\" (UID: \"521f722d-8bc2-4b4d-849a-9eda46c2bcf4\") " pod="kserve/seaweedfs-86cc847c5c-wnmw4"
Apr 16 22:20:32.716786 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.716771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/521f722d-8bc2-4b4d-849a-9eda46c2bcf4-data\") pod \"seaweedfs-86cc847c5c-wnmw4\" (UID: \"521f722d-8bc2-4b4d-849a-9eda46c2bcf4\") " pod="kserve/seaweedfs-86cc847c5c-wnmw4"
Apr 16 22:20:32.726222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.726191 2576 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"kube-api-access-5zbgj\" (UniqueName: \"kubernetes.io/projected/521f722d-8bc2-4b4d-849a-9eda46c2bcf4-kube-api-access-5zbgj\") pod \"seaweedfs-86cc847c5c-wnmw4\" (UID: \"521f722d-8bc2-4b4d-849a-9eda46c2bcf4\") " pod="kserve/seaweedfs-86cc847c5c-wnmw4" Apr 16 22:20:32.742913 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.742887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-wnmw4" Apr 16 22:20:32.860023 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:32.859998 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-wnmw4"] Apr 16 22:20:32.862236 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:20:32.862204 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod521f722d_8bc2_4b4d_849a_9eda46c2bcf4.slice/crio-1662e6337aa3e713533f278b630831576ad24ddc7fa17b8d8ee9a162e04cfead WatchSource:0}: Error finding container 1662e6337aa3e713533f278b630831576ad24ddc7fa17b8d8ee9a162e04cfead: Status 404 returned error can't find the container with id 1662e6337aa3e713533f278b630831576ad24ddc7fa17b8d8ee9a162e04cfead Apr 16 22:20:33.119564 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:33.119475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert\") pod \"kserve-controller-manager-84d7d5cfc6-ppfds\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" Apr 16 22:20:33.121823 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:33.121803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert\") pod \"kserve-controller-manager-84d7d5cfc6-ppfds\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " 
pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" Apr 16 22:20:33.307189 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:33.307148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" Apr 16 22:20:33.531794 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:33.531765 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-ppfds"] Apr 16 22:20:33.535022 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:20:33.534985 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdc26ee9_46e8_464a_9073_1b8f9ddef561.slice/crio-10a3fb7ba634b81f27a089f76edacad3bc5452fc6048b47d60bfcf30b6a13d11 WatchSource:0}: Error finding container 10a3fb7ba634b81f27a089f76edacad3bc5452fc6048b47d60bfcf30b6a13d11: Status 404 returned error can't find the container with id 10a3fb7ba634b81f27a089f76edacad3bc5452fc6048b47d60bfcf30b6a13d11 Apr 16 22:20:33.743747 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:33.743706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-wnmw4" event={"ID":"521f722d-8bc2-4b4d-849a-9eda46c2bcf4","Type":"ContainerStarted","Data":"1662e6337aa3e713533f278b630831576ad24ddc7fa17b8d8ee9a162e04cfead"} Apr 16 22:20:33.744954 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:33.744923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" event={"ID":"bdc26ee9-46e8-464a-9073-1b8f9ddef561","Type":"ContainerStarted","Data":"10a3fb7ba634b81f27a089f76edacad3bc5452fc6048b47d60bfcf30b6a13d11"} Apr 16 22:20:37.757217 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:37.757179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-wnmw4" 
event={"ID":"521f722d-8bc2-4b4d-849a-9eda46c2bcf4","Type":"ContainerStarted","Data":"383f374f3fa5ce8714e751ac499fe27c5c42ffa5faaa5dc19d839f610dc9b65e"} Apr 16 22:20:37.757652 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:37.757282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-wnmw4" Apr 16 22:20:37.758597 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:37.758573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" event={"ID":"bdc26ee9-46e8-464a-9073-1b8f9ddef561","Type":"ContainerStarted","Data":"1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e"} Apr 16 22:20:37.758737 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:37.758725 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" Apr 16 22:20:37.773362 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:37.773306 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-wnmw4" podStartSLOduration=1.789320412 podStartE2EDuration="5.7732917s" podCreationTimestamp="2026-04-16 22:20:32 +0000 UTC" firstStartedPulling="2026-04-16 22:20:32.863500249 +0000 UTC m=+411.988124514" lastFinishedPulling="2026-04-16 22:20:36.847471527 +0000 UTC m=+415.972095802" observedRunningTime="2026-04-16 22:20:37.771938224 +0000 UTC m=+416.896562514" watchObservedRunningTime="2026-04-16 22:20:37.7732917 +0000 UTC m=+416.897915988" Apr 16 22:20:37.787281 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:37.787240 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" podStartSLOduration=2.529404162 podStartE2EDuration="5.787227564s" podCreationTimestamp="2026-04-16 22:20:32 +0000 UTC" firstStartedPulling="2026-04-16 22:20:33.536620225 +0000 UTC m=+412.661244490" lastFinishedPulling="2026-04-16 
22:20:36.794443622 +0000 UTC m=+415.919067892" observedRunningTime="2026-04-16 22:20:37.78576623 +0000 UTC m=+416.910390517" watchObservedRunningTime="2026-04-16 22:20:37.787227564 +0000 UTC m=+416.911851851" Apr 16 22:20:43.763861 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:20:43.763826 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-wnmw4" Apr 16 22:21:07.325572 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.325492 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-ppfds"] Apr 16 22:21:07.326105 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.325748 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" podUID="bdc26ee9-46e8-464a-9073-1b8f9ddef561" containerName="manager" containerID="cri-o://1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e" gracePeriod=10 Apr 16 22:21:07.330210 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.330187 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" Apr 16 22:21:07.351754 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.351729 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-2qdg6"] Apr 16 22:21:07.355224 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.355208 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:07.360923 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.360900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-2qdg6"] Apr 16 22:21:07.484345 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.484298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0231-b7e5-418f-815e-42e8c0618990-cert\") pod \"kserve-controller-manager-84d7d5cfc6-2qdg6\" (UID: \"375b0231-b7e5-418f-815e-42e8c0618990\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:07.484517 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.484448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrkt\" (UniqueName: \"kubernetes.io/projected/375b0231-b7e5-418f-815e-42e8c0618990-kube-api-access-rvrkt\") pod \"kserve-controller-manager-84d7d5cfc6-2qdg6\" (UID: \"375b0231-b7e5-418f-815e-42e8c0618990\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:07.550115 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.550095 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" Apr 16 22:21:07.585456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.585388 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv7hw\" (UniqueName: \"kubernetes.io/projected/bdc26ee9-46e8-464a-9073-1b8f9ddef561-kube-api-access-gv7hw\") pod \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " Apr 16 22:21:07.585580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.585473 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert\") pod \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\" (UID: \"bdc26ee9-46e8-464a-9073-1b8f9ddef561\") " Apr 16 22:21:07.585627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.585580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrkt\" (UniqueName: \"kubernetes.io/projected/375b0231-b7e5-418f-815e-42e8c0618990-kube-api-access-rvrkt\") pod \"kserve-controller-manager-84d7d5cfc6-2qdg6\" (UID: \"375b0231-b7e5-418f-815e-42e8c0618990\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:07.585674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.585628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0231-b7e5-418f-815e-42e8c0618990-cert\") pod \"kserve-controller-manager-84d7d5cfc6-2qdg6\" (UID: \"375b0231-b7e5-418f-815e-42e8c0618990\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:07.587603 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.587571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc26ee9-46e8-464a-9073-1b8f9ddef561-kube-api-access-gv7hw" (OuterVolumeSpecName: "kube-api-access-gv7hw") pod 
"bdc26ee9-46e8-464a-9073-1b8f9ddef561" (UID: "bdc26ee9-46e8-464a-9073-1b8f9ddef561"). InnerVolumeSpecName "kube-api-access-gv7hw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:21:07.587603 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.587588 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert" (OuterVolumeSpecName: "cert") pod "bdc26ee9-46e8-464a-9073-1b8f9ddef561" (UID: "bdc26ee9-46e8-464a-9073-1b8f9ddef561"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:21:07.587916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.587899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375b0231-b7e5-418f-815e-42e8c0618990-cert\") pod \"kserve-controller-manager-84d7d5cfc6-2qdg6\" (UID: \"375b0231-b7e5-418f-815e-42e8c0618990\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:07.593562 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.593538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrkt\" (UniqueName: \"kubernetes.io/projected/375b0231-b7e5-418f-815e-42e8c0618990-kube-api-access-rvrkt\") pod \"kserve-controller-manager-84d7d5cfc6-2qdg6\" (UID: \"375b0231-b7e5-418f-815e-42e8c0618990\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:07.686823 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.686793 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gv7hw\" (UniqueName: \"kubernetes.io/projected/bdc26ee9-46e8-464a-9073-1b8f9ddef561-kube-api-access-gv7hw\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:21:07.686823 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.686821 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/bdc26ee9-46e8-464a-9073-1b8f9ddef561-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:21:07.718740 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.718716 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:07.836728 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.836653 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-2qdg6"] Apr 16 22:21:07.840080 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:21:07.840052 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod375b0231_b7e5_418f_815e_42e8c0618990.slice/crio-4d4412b9fbde8019f6f5a2c9d269e49bcb9371fcd02b09f3752c2eb668929f14 WatchSource:0}: Error finding container 4d4412b9fbde8019f6f5a2c9d269e49bcb9371fcd02b09f3752c2eb668929f14: Status 404 returned error can't find the container with id 4d4412b9fbde8019f6f5a2c9d269e49bcb9371fcd02b09f3752c2eb668929f14 Apr 16 22:21:07.847158 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.847131 2576 generic.go:358] "Generic (PLEG): container finished" podID="bdc26ee9-46e8-464a-9073-1b8f9ddef561" containerID="1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e" exitCode=0 Apr 16 22:21:07.847298 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.847234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" event={"ID":"bdc26ee9-46e8-464a-9073-1b8f9ddef561","Type":"ContainerDied","Data":"1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e"} Apr 16 22:21:07.847298 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.847278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" 
event={"ID":"bdc26ee9-46e8-464a-9073-1b8f9ddef561","Type":"ContainerDied","Data":"10a3fb7ba634b81f27a089f76edacad3bc5452fc6048b47d60bfcf30b6a13d11"} Apr 16 22:21:07.847483 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.847302 2576 scope.go:117] "RemoveContainer" containerID="1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e" Apr 16 22:21:07.847593 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.847545 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-ppfds" Apr 16 22:21:07.850580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.850537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" event={"ID":"375b0231-b7e5-418f-815e-42e8c0618990","Type":"ContainerStarted","Data":"4d4412b9fbde8019f6f5a2c9d269e49bcb9371fcd02b09f3752c2eb668929f14"} Apr 16 22:21:07.857278 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.857230 2576 scope.go:117] "RemoveContainer" containerID="1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e" Apr 16 22:21:07.857548 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:21:07.857523 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e\": container with ID starting with 1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e not found: ID does not exist" containerID="1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e" Apr 16 22:21:07.857601 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.857564 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e"} err="failed to get container status \"1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e\": rpc error: code = NotFound desc = could not find 
container \"1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e\": container with ID starting with 1d6841f46f12ff738745530c86483ea17abcace68769fbee6f28095fd136717e not found: ID does not exist" Apr 16 22:21:07.869465 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.869443 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-ppfds"] Apr 16 22:21:07.873774 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:07.873751 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-ppfds"] Apr 16 22:21:08.855208 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:08.855170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" event={"ID":"375b0231-b7e5-418f-815e-42e8c0618990","Type":"ContainerStarted","Data":"b800087db28e1493088e78888ed2fcaf6f39a73d5d7f16b666ce00c32baaface"} Apr 16 22:21:08.855639 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:08.855297 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:08.872766 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:08.872722 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" podStartSLOduration=1.550625457 podStartE2EDuration="1.872709359s" podCreationTimestamp="2026-04-16 22:21:07 +0000 UTC" firstStartedPulling="2026-04-16 22:21:07.841305802 +0000 UTC m=+446.965930067" lastFinishedPulling="2026-04-16 22:21:08.163389703 +0000 UTC m=+447.288013969" observedRunningTime="2026-04-16 22:21:08.871908562 +0000 UTC m=+447.996532855" watchObservedRunningTime="2026-04-16 22:21:08.872709359 +0000 UTC m=+447.997333645" Apr 16 22:21:09.453202 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:09.453170 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bdc26ee9-46e8-464a-9073-1b8f9ddef561" path="/var/lib/kubelet/pods/bdc26ee9-46e8-464a-9073-1b8f9ddef561/volumes" Apr 16 22:21:39.863704 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:39.863675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-2qdg6" Apr 16 22:21:40.697312 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.697272 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-28qb8"] Apr 16 22:21:40.697611 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.697596 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdc26ee9-46e8-464a-9073-1b8f9ddef561" containerName="manager" Apr 16 22:21:40.697654 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.697613 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc26ee9-46e8-464a-9073-1b8f9ddef561" containerName="manager" Apr 16 22:21:40.697692 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.697664 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdc26ee9-46e8-464a-9073-1b8f9ddef561" containerName="manager" Apr 16 22:21:40.700488 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.700464 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:40.702709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.702691 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 22:21:40.702796 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.702700 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-qt5bj\"" Apr 16 22:21:40.709255 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.709232 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-28qb8"] Apr 16 22:21:40.849712 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.849672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/518c2439-1b1d-45e9-9af6-e80e3c47182d-tls-certs\") pod \"model-serving-api-86f7b4b499-28qb8\" (UID: \"518c2439-1b1d-45e9-9af6-e80e3c47182d\") " pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:40.849712 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.849713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwtv6\" (UniqueName: \"kubernetes.io/projected/518c2439-1b1d-45e9-9af6-e80e3c47182d-kube-api-access-xwtv6\") pod \"model-serving-api-86f7b4b499-28qb8\" (UID: \"518c2439-1b1d-45e9-9af6-e80e3c47182d\") " pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:40.950530 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.950447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/518c2439-1b1d-45e9-9af6-e80e3c47182d-tls-certs\") pod \"model-serving-api-86f7b4b499-28qb8\" (UID: \"518c2439-1b1d-45e9-9af6-e80e3c47182d\") " pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:40.950530 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:21:40.950487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwtv6\" (UniqueName: \"kubernetes.io/projected/518c2439-1b1d-45e9-9af6-e80e3c47182d-kube-api-access-xwtv6\") pod \"model-serving-api-86f7b4b499-28qb8\" (UID: \"518c2439-1b1d-45e9-9af6-e80e3c47182d\") " pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:40.953015 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.952991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/518c2439-1b1d-45e9-9af6-e80e3c47182d-tls-certs\") pod \"model-serving-api-86f7b4b499-28qb8\" (UID: \"518c2439-1b1d-45e9-9af6-e80e3c47182d\") " pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:40.959127 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:40.959101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwtv6\" (UniqueName: \"kubernetes.io/projected/518c2439-1b1d-45e9-9af6-e80e3c47182d-kube-api-access-xwtv6\") pod \"model-serving-api-86f7b4b499-28qb8\" (UID: \"518c2439-1b1d-45e9-9af6-e80e3c47182d\") " pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:41.011077 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:41.011047 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:41.134708 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:41.134677 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-28qb8"] Apr 16 22:21:41.138088 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:21:41.138058 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod518c2439_1b1d_45e9_9af6_e80e3c47182d.slice/crio-66e80077a8cc8bab457f3dd79231b2539e5f1e3f4cbe17e7489d60af311399fd WatchSource:0}: Error finding container 66e80077a8cc8bab457f3dd79231b2539e5f1e3f4cbe17e7489d60af311399fd: Status 404 returned error can't find the container with id 66e80077a8cc8bab457f3dd79231b2539e5f1e3f4cbe17e7489d60af311399fd Apr 16 22:21:41.954702 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:41.954667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-28qb8" event={"ID":"518c2439-1b1d-45e9-9af6-e80e3c47182d","Type":"ContainerStarted","Data":"66e80077a8cc8bab457f3dd79231b2539e5f1e3f4cbe17e7489d60af311399fd"} Apr 16 22:21:42.958968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:42.958928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-28qb8" event={"ID":"518c2439-1b1d-45e9-9af6-e80e3c47182d","Type":"ContainerStarted","Data":"beb7d8314632bdebc7c5d3652a8f8c29d68aaaf5881b0c9b1c002c6e4ca556aa"} Apr 16 22:21:42.959388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:42.959053 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:21:42.989516 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:42.989470 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-28qb8" podStartSLOduration=1.873191002 podStartE2EDuration="2.989455834s" 
podCreationTimestamp="2026-04-16 22:21:40 +0000 UTC" firstStartedPulling="2026-04-16 22:21:41.139802688 +0000 UTC m=+480.264426953" lastFinishedPulling="2026-04-16 22:21:42.256067517 +0000 UTC m=+481.380691785" observedRunningTime="2026-04-16 22:21:42.988383356 +0000 UTC m=+482.113007643" watchObservedRunningTime="2026-04-16 22:21:42.989455834 +0000 UTC m=+482.114080121" Apr 16 22:21:53.966631 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:21:53.966602 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-28qb8" Apr 16 22:22:07.072271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.072236 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl"] Apr 16 22:22:07.074495 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.074472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:07.076704 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.076682 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 22:22:07.081722 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.081699 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl"] Apr 16 22:22:07.131993 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.131958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6hs\" (UniqueName: \"kubernetes.io/projected/7599663a-37ff-483e-9f03-8e3df5851bcb-kube-api-access-qt6hs\") pod \"seaweedfs-tls-custom-ddd4dbfd-t8ffl\" (UID: \"7599663a-37ff-483e-9f03-8e3df5851bcb\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:07.131993 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.131995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data\" (UniqueName: \"kubernetes.io/empty-dir/7599663a-37ff-483e-9f03-8e3df5851bcb-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-t8ffl\" (UID: \"7599663a-37ff-483e-9f03-8e3df5851bcb\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:07.233182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.233146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6hs\" (UniqueName: \"kubernetes.io/projected/7599663a-37ff-483e-9f03-8e3df5851bcb-kube-api-access-qt6hs\") pod \"seaweedfs-tls-custom-ddd4dbfd-t8ffl\" (UID: \"7599663a-37ff-483e-9f03-8e3df5851bcb\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:07.233182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.233185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7599663a-37ff-483e-9f03-8e3df5851bcb-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-t8ffl\" (UID: \"7599663a-37ff-483e-9f03-8e3df5851bcb\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:07.233556 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.233536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7599663a-37ff-483e-9f03-8e3df5851bcb-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-t8ffl\" (UID: \"7599663a-37ff-483e-9f03-8e3df5851bcb\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:07.240900 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.240880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6hs\" (UniqueName: \"kubernetes.io/projected/7599663a-37ff-483e-9f03-8e3df5851bcb-kube-api-access-qt6hs\") pod \"seaweedfs-tls-custom-ddd4dbfd-t8ffl\" (UID: \"7599663a-37ff-483e-9f03-8e3df5851bcb\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:07.384269 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.384179 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:07.505588 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:07.505545 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl"] Apr 16 22:22:07.508434 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:22:07.508403 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7599663a_37ff_483e_9f03_8e3df5851bcb.slice/crio-11a2e02582007d882859562daa313166cf470c9105c3df16adf471d304e38488 WatchSource:0}: Error finding container 11a2e02582007d882859562daa313166cf470c9105c3df16adf471d304e38488: Status 404 returned error can't find the container with id 11a2e02582007d882859562daa313166cf470c9105c3df16adf471d304e38488 Apr 16 22:22:08.030039 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:08.029946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" event={"ID":"7599663a-37ff-483e-9f03-8e3df5851bcb","Type":"ContainerStarted","Data":"b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5"} Apr 16 22:22:08.030039 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:08.029985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" event={"ID":"7599663a-37ff-483e-9f03-8e3df5851bcb","Type":"ContainerStarted","Data":"11a2e02582007d882859562daa313166cf470c9105c3df16adf471d304e38488"} Apr 16 22:22:08.045034 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:08.044982 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" podStartSLOduration=0.809063718 podStartE2EDuration="1.044965903s" podCreationTimestamp="2026-04-16 22:22:07 +0000 UTC" firstStartedPulling="2026-04-16 22:22:07.50964826 +0000 UTC m=+506.634272526" lastFinishedPulling="2026-04-16 22:22:07.745550446 +0000 UTC m=+506.870174711" 
observedRunningTime="2026-04-16 22:22:08.04376382 +0000 UTC m=+507.168388111" watchObservedRunningTime="2026-04-16 22:22:08.044965903 +0000 UTC m=+507.169590190" Apr 16 22:22:09.502878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:09.502844 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl"] Apr 16 22:22:10.038577 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:10.038516 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" podUID="7599663a-37ff-483e-9f03-8e3df5851bcb" containerName="seaweedfs-tls-custom" containerID="cri-o://b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5" gracePeriod=30 Apr 16 22:22:38.476963 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:38.476934 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:38.568644 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:38.568556 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6hs\" (UniqueName: \"kubernetes.io/projected/7599663a-37ff-483e-9f03-8e3df5851bcb-kube-api-access-qt6hs\") pod \"7599663a-37ff-483e-9f03-8e3df5851bcb\" (UID: \"7599663a-37ff-483e-9f03-8e3df5851bcb\") " Apr 16 22:22:38.568808 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:38.568659 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7599663a-37ff-483e-9f03-8e3df5851bcb-data\") pod \"7599663a-37ff-483e-9f03-8e3df5851bcb\" (UID: \"7599663a-37ff-483e-9f03-8e3df5851bcb\") " Apr 16 22:22:38.569853 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:38.569826 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7599663a-37ff-483e-9f03-8e3df5851bcb-data" (OuterVolumeSpecName: "data") pod "7599663a-37ff-483e-9f03-8e3df5851bcb" (UID: 
"7599663a-37ff-483e-9f03-8e3df5851bcb"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:22:38.570718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:38.570692 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7599663a-37ff-483e-9f03-8e3df5851bcb-kube-api-access-qt6hs" (OuterVolumeSpecName: "kube-api-access-qt6hs") pod "7599663a-37ff-483e-9f03-8e3df5851bcb" (UID: "7599663a-37ff-483e-9f03-8e3df5851bcb"). InnerVolumeSpecName "kube-api-access-qt6hs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:38.669237 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:38.669205 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qt6hs\" (UniqueName: \"kubernetes.io/projected/7599663a-37ff-483e-9f03-8e3df5851bcb-kube-api-access-qt6hs\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:22:38.669237 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:38.669237 2576 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7599663a-37ff-483e-9f03-8e3df5851bcb-data\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:22:39.129749 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.129709 2576 generic.go:358] "Generic (PLEG): container finished" podID="7599663a-37ff-483e-9f03-8e3df5851bcb" containerID="b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5" exitCode=0 Apr 16 22:22:39.129918 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.129769 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" Apr 16 22:22:39.129918 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.129788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" event={"ID":"7599663a-37ff-483e-9f03-8e3df5851bcb","Type":"ContainerDied","Data":"b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5"} Apr 16 22:22:39.129918 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.129826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl" event={"ID":"7599663a-37ff-483e-9f03-8e3df5851bcb","Type":"ContainerDied","Data":"11a2e02582007d882859562daa313166cf470c9105c3df16adf471d304e38488"} Apr 16 22:22:39.129918 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.129841 2576 scope.go:117] "RemoveContainer" containerID="b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5" Apr 16 22:22:39.138672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.138657 2576 scope.go:117] "RemoveContainer" containerID="b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5" Apr 16 22:22:39.138893 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:22:39.138876 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5\": container with ID starting with b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5 not found: ID does not exist" containerID="b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5" Apr 16 22:22:39.138936 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.138903 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5"} err="failed to get container status \"b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5\": rpc error: code = 
NotFound desc = could not find container \"b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5\": container with ID starting with b8a740b8df312ae6e42ca1c41e7cd73aac7626fc85ae64fc3d4ac6aeabba88d5 not found: ID does not exist" Apr 16 22:22:39.149891 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.149864 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl"] Apr 16 22:22:39.153514 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.153493 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-t8ffl"] Apr 16 22:22:39.180957 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.180931 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs"] Apr 16 22:22:39.181239 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.181227 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7599663a-37ff-483e-9f03-8e3df5851bcb" containerName="seaweedfs-tls-custom" Apr 16 22:22:39.181283 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.181241 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7599663a-37ff-483e-9f03-8e3df5851bcb" containerName="seaweedfs-tls-custom" Apr 16 22:22:39.181316 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.181308 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7599663a-37ff-483e-9f03-8e3df5851bcb" containerName="seaweedfs-tls-custom" Apr 16 22:22:39.184155 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.184137 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.186491 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.186474 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 22:22:39.186583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.186478 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 16 22:22:39.191495 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.191475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs"] Apr 16 22:22:39.274110 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.274071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/8c535426-ec52-40a4-8031-7068e4ce28d1-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" (UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.274110 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.274114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8c535426-ec52-40a4-8031-7068e4ce28d1-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" (UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.274355 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.274144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bzdn\" (UniqueName: \"kubernetes.io/projected/8c535426-ec52-40a4-8031-7068e4ce28d1-kube-api-access-2bzdn\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" (UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" 
Apr 16 22:22:39.375587 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.375548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/8c535426-ec52-40a4-8031-7068e4ce28d1-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" (UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.375587 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.375588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8c535426-ec52-40a4-8031-7068e4ce28d1-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" (UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.375758 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.375620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bzdn\" (UniqueName: \"kubernetes.io/projected/8c535426-ec52-40a4-8031-7068e4ce28d1-kube-api-access-2bzdn\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" (UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.376096 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.376072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8c535426-ec52-40a4-8031-7068e4ce28d1-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" (UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.378129 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.378107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/8c535426-ec52-40a4-8031-7068e4ce28d1-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" 
(UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.383674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.383616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bzdn\" (UniqueName: \"kubernetes.io/projected/8c535426-ec52-40a4-8031-7068e4ce28d1-kube-api-access-2bzdn\") pod \"seaweedfs-tls-custom-5c88b85bb7-qzgvs\" (UID: \"8c535426-ec52-40a4-8031-7068e4ce28d1\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.453024 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.452989 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7599663a-37ff-483e-9f03-8e3df5851bcb" path="/var/lib/kubelet/pods/7599663a-37ff-483e-9f03-8e3df5851bcb/volumes" Apr 16 22:22:39.494608 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.494578 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" Apr 16 22:22:39.615308 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:39.615272 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs"] Apr 16 22:22:39.618744 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:22:39.618716 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c535426_ec52_40a4_8031_7068e4ce28d1.slice/crio-6fd981419288b4b7e11c0a04fea357e087331e7aeef2ef29a6edb07244f109bf WatchSource:0}: Error finding container 6fd981419288b4b7e11c0a04fea357e087331e7aeef2ef29a6edb07244f109bf: Status 404 returned error can't find the container with id 6fd981419288b4b7e11c0a04fea357e087331e7aeef2ef29a6edb07244f109bf Apr 16 22:22:40.134223 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:40.134128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" 
event={"ID":"8c535426-ec52-40a4-8031-7068e4ce28d1","Type":"ContainerStarted","Data":"38f5ea4c78512e4f0ef52ecb196c87c7d113b9a16e22d51853316aaeab28ad69"} Apr 16 22:22:40.134223 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:40.134167 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" event={"ID":"8c535426-ec52-40a4-8031-7068e4ce28d1","Type":"ContainerStarted","Data":"6fd981419288b4b7e11c0a04fea357e087331e7aeef2ef29a6edb07244f109bf"} Apr 16 22:22:40.150662 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:40.150609 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-qzgvs" podStartSLOduration=0.890654165 podStartE2EDuration="1.150596258s" podCreationTimestamp="2026-04-16 22:22:39 +0000 UTC" firstStartedPulling="2026-04-16 22:22:39.62001148 +0000 UTC m=+538.744635744" lastFinishedPulling="2026-04-16 22:22:39.879953566 +0000 UTC m=+539.004577837" observedRunningTime="2026-04-16 22:22:40.14957164 +0000 UTC m=+539.274195930" watchObservedRunningTime="2026-04-16 22:22:40.150596258 +0000 UTC m=+539.275220545" Apr 16 22:22:48.122219 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.122183 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-f57cn"] Apr 16 22:22:48.124878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.124859 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.127234 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.127207 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 22:22:48.127360 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.127207 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 22:22:48.134256 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.134236 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-f57cn"] Apr 16 22:22:48.244509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.244462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.244682 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.244520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj6r6\" (UniqueName: \"kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-kube-api-access-vj6r6\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.244682 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.244584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/be52c574-4b03-4c0b-816f-4b024c5aa785-data\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.345578 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.345536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.345578 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.345584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj6r6\" (UniqueName: \"kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-kube-api-access-vj6r6\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.345772 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:22:48.345676 2576 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 16 22:22:48.345772 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:22:48.345697 2576 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-f57cn: secret "seaweedfs-tls-serving" not found Apr 16 22:22:48.345772 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.345694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/be52c574-4b03-4c0b-816f-4b024c5aa785-data\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.345772 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:22:48.345748 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-seaweedfs-tls-serving podName:be52c574-4b03-4c0b-816f-4b024c5aa785 nodeName:}" failed. No retries permitted until 2026-04-16 22:22:48.845727988 +0000 UTC m=+547.970352253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-f57cn" (UID: "be52c574-4b03-4c0b-816f-4b024c5aa785") : secret "seaweedfs-tls-serving" not found Apr 16 22:22:48.346009 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.345992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/be52c574-4b03-4c0b-816f-4b024c5aa785-data\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.354273 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.354252 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj6r6\" (UniqueName: \"kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-kube-api-access-vj6r6\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.850131 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.850090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:48.852522 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:48.852495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/be52c574-4b03-4c0b-816f-4b024c5aa785-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-f57cn\" (UID: \"be52c574-4b03-4c0b-816f-4b024c5aa785\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:49.034507 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:49.034472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" Apr 16 22:22:49.161916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:49.161883 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-f57cn"] Apr 16 22:22:49.166613 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:22:49.166577 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe52c574_4b03_4c0b_816f_4b024c5aa785.slice/crio-d0923904ac9b2d4abceeffb6a944aa9d9c29f927f0c77c21f6364487cd897ffe WatchSource:0}: Error finding container d0923904ac9b2d4abceeffb6a944aa9d9c29f927f0c77c21f6364487cd897ffe: Status 404 returned error can't find the container with id d0923904ac9b2d4abceeffb6a944aa9d9c29f927f0c77c21f6364487cd897ffe Apr 16 22:22:50.167787 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:50.167747 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" event={"ID":"be52c574-4b03-4c0b-816f-4b024c5aa785","Type":"ContainerStarted","Data":"ae4228dbf0b0db59077064f888dda43f36f4483cad829e69656d7bd09d7bf2f9"} Apr 16 22:22:50.168152 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:22:50.167791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" event={"ID":"be52c574-4b03-4c0b-816f-4b024c5aa785","Type":"ContainerStarted","Data":"d0923904ac9b2d4abceeffb6a944aa9d9c29f927f0c77c21f6364487cd897ffe"} Apr 16 22:23:08.978954 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.978876 
2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-f57cn" podStartSLOduration=20.706327339 podStartE2EDuration="20.978856171s" podCreationTimestamp="2026-04-16 22:22:48 +0000 UTC" firstStartedPulling="2026-04-16 22:22:49.167825801 +0000 UTC m=+548.292450066" lastFinishedPulling="2026-04-16 22:22:49.440354632 +0000 UTC m=+548.564978898" observedRunningTime="2026-04-16 22:22:50.197628921 +0000 UTC m=+549.322253221" watchObservedRunningTime="2026-04-16 22:23:08.978856171 +0000 UTC m=+568.103480463" Apr 16 22:23:08.980116 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.980082 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq"] Apr 16 22:23:08.982769 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.982749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:08.985522 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.985498 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 16 22:23:08.985642 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.985513 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-f9tz9\"" Apr 16 22:23:08.985714 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.985584 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:23:08.985714 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.985591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:23:08.985884 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.985592 2576 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 16 22:23:08.997192 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:08.997161 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq"] Apr 16 22:23:09.105093 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.105060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.105255 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.105101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.105255 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.105147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.105255 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.105186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gx4nd\" (UniqueName: \"kubernetes.io/projected/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kube-api-access-gx4nd\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.205577 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.205545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.205724 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.205587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.205724 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.205646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.205724 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.205688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx4nd\" (UniqueName: 
\"kubernetes.io/projected/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kube-api-access-gx4nd\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.205882 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:23:09.205782 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-serving-cert: secret "isvc-sklearn-batcher-predictor-serving-cert" not found Apr 16 22:23:09.205882 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:23:09.205857 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls podName:7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a nodeName:}" failed. No retries permitted until 2026-04-16 22:23:09.7058411 +0000 UTC m=+568.830465383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls") pod "isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" (UID: "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a") : secret "isvc-sklearn-batcher-predictor-serving-cert" not found Apr 16 22:23:09.206018 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.205999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.206292 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.206274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.214573 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.214545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx4nd\" (UniqueName: \"kubernetes.io/projected/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kube-api-access-gx4nd\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.709652 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.709614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.712040 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.712022 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:09.893392 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:09.893356 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:10.014950 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:10.014923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq"] Apr 16 22:23:10.017519 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:23:10.017487 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e6fc2d5_a1c9_4aab_a069_e5f89a741f6a.slice/crio-61c11a7e991094c8e35007d2882f8c50b0bc2a94cf89f10541fd044bb6677666 WatchSource:0}: Error finding container 61c11a7e991094c8e35007d2882f8c50b0bc2a94cf89f10541fd044bb6677666: Status 404 returned error can't find the container with id 61c11a7e991094c8e35007d2882f8c50b0bc2a94cf89f10541fd044bb6677666 Apr 16 22:23:10.231867 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:10.231784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerStarted","Data":"61c11a7e991094c8e35007d2882f8c50b0bc2a94cf89f10541fd044bb6677666"} Apr 16 22:23:13.243729 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:13.243691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerStarted","Data":"cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f"} Apr 16 22:23:17.257653 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:17.257622 2576 generic.go:358] "Generic (PLEG): container finished" podID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerID="cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f" exitCode=0 Apr 16 22:23:17.258032 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:17.257699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerDied","Data":"cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f"} Apr 16 22:23:25.734393 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:25.734357 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67b4f68575-44mc5"] Apr 16 22:23:30.318662 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:30.318629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerStarted","Data":"2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8"} Apr 16 22:23:32.327490 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:32.327454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerStarted","Data":"887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487"} Apr 16 22:23:35.341853 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:35.341815 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerStarted","Data":"65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c"} Apr 16 22:23:35.342369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:35.342021 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:35.361984 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:35.361933 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podStartSLOduration=3.158254407 
podStartE2EDuration="27.36191737s" podCreationTimestamp="2026-04-16 22:23:08 +0000 UTC" firstStartedPulling="2026-04-16 22:23:10.019507686 +0000 UTC m=+569.144131955" lastFinishedPulling="2026-04-16 22:23:34.223170648 +0000 UTC m=+593.347794918" observedRunningTime="2026-04-16 22:23:35.360240501 +0000 UTC m=+594.484864788" watchObservedRunningTime="2026-04-16 22:23:35.36191737 +0000 UTC m=+594.486541656" Apr 16 22:23:36.345809 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:36.345775 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:36.345809 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:36.345814 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:36.347166 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:36.347134 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:23:36.347951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:36.347922 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:23:36.350703 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:36.350684 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:23:37.349001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:37.348959 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:23:37.349395 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:37.349373 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:23:38.352282 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:38.352236 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:23:38.352737 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:38.352546 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:23:48.353011 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:48.352964 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:23:48.355696 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:48.353430 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 16 22:23:50.758423 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:50.758376 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67b4f68575-44mc5" podUID="984d97bd-2515-4b90-9eb2-cad3c0ac81f0" containerName="console" containerID="cri-o://169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7" gracePeriod=15 Apr 16 22:23:50.995457 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:50.995438 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67b4f68575-44mc5_984d97bd-2515-4b90-9eb2-cad3c0ac81f0/console/0.log" Apr 16 22:23:50.995571 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:50.995494 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:23:51.062463 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062372 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-trusted-ca-bundle\") pod \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " Apr 16 22:23:51.062463 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062432 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-oauth-config\") pod \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " Apr 16 22:23:51.062463 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062453 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-oauth-serving-cert\") pod \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\" (UID: 
\"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " Apr 16 22:23:51.062727 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062510 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6mjg\" (UniqueName: \"kubernetes.io/projected/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-kube-api-access-x6mjg\") pod \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " Apr 16 22:23:51.062727 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062540 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-serving-cert\") pod \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " Apr 16 22:23:51.062727 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-config\") pod \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " Apr 16 22:23:51.062727 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-service-ca\") pod \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\" (UID: \"984d97bd-2515-4b90-9eb2-cad3c0ac81f0\") " Apr 16 22:23:51.062961 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062815 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "984d97bd-2515-4b90-9eb2-cad3c0ac81f0" (UID: "984d97bd-2515-4b90-9eb2-cad3c0ac81f0"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:51.063032 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.062912 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "984d97bd-2515-4b90-9eb2-cad3c0ac81f0" (UID: "984d97bd-2515-4b90-9eb2-cad3c0ac81f0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:51.063088 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.063031 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-service-ca" (OuterVolumeSpecName: "service-ca") pod "984d97bd-2515-4b90-9eb2-cad3c0ac81f0" (UID: "984d97bd-2515-4b90-9eb2-cad3c0ac81f0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:51.063169 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.063147 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-config" (OuterVolumeSpecName: "console-config") pod "984d97bd-2515-4b90-9eb2-cad3c0ac81f0" (UID: "984d97bd-2515-4b90-9eb2-cad3c0ac81f0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:51.064674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.064648 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "984d97bd-2515-4b90-9eb2-cad3c0ac81f0" (UID: "984d97bd-2515-4b90-9eb2-cad3c0ac81f0"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:23:51.064803 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.064714 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "984d97bd-2515-4b90-9eb2-cad3c0ac81f0" (UID: "984d97bd-2515-4b90-9eb2-cad3c0ac81f0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:23:51.064803 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.064774 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-kube-api-access-x6mjg" (OuterVolumeSpecName: "kube-api-access-x6mjg") pod "984d97bd-2515-4b90-9eb2-cad3c0ac81f0" (UID: "984d97bd-2515-4b90-9eb2-cad3c0ac81f0"). InnerVolumeSpecName "kube-api-access-x6mjg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:23:51.163489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.163445 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-oauth-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:23:51.163489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.163484 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-oauth-serving-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:23:51.163489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.163494 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6mjg\" (UniqueName: \"kubernetes.io/projected/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-kube-api-access-x6mjg\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:23:51.163489 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.163504 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-serving-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:23:51.163750 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.163514 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-console-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:23:51.163750 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.163524 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-service-ca\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:23:51.163750 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.163532 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/984d97bd-2515-4b90-9eb2-cad3c0ac81f0-trusted-ca-bundle\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:23:51.389388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.389299 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67b4f68575-44mc5_984d97bd-2515-4b90-9eb2-cad3c0ac81f0/console/0.log" Apr 16 22:23:51.389388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.389361 2576 generic.go:358] "Generic (PLEG): container finished" podID="984d97bd-2515-4b90-9eb2-cad3c0ac81f0" containerID="169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7" exitCode=2 Apr 16 22:23:51.389562 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.389429 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67b4f68575-44mc5" Apr 16 22:23:51.389562 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.389444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67b4f68575-44mc5" event={"ID":"984d97bd-2515-4b90-9eb2-cad3c0ac81f0","Type":"ContainerDied","Data":"169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7"} Apr 16 22:23:51.389562 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.389486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67b4f68575-44mc5" event={"ID":"984d97bd-2515-4b90-9eb2-cad3c0ac81f0","Type":"ContainerDied","Data":"3263d199fc4a609a19ddaa043fb9a6436f3b7f3bf697c91e84b9d7db0604d1fb"} Apr 16 22:23:51.389562 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.389507 2576 scope.go:117] "RemoveContainer" containerID="169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7" Apr 16 22:23:51.397439 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.397394 2576 scope.go:117] "RemoveContainer" containerID="169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7" Apr 16 22:23:51.397653 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:23:51.397633 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7\": container with ID starting with 169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7 not found: ID does not exist" containerID="169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7" Apr 16 22:23:51.397700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.397671 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7"} err="failed to get container status \"169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7\": rpc error: code = 
NotFound desc = could not find container \"169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7\": container with ID starting with 169107ca2459ce948a73bf7c1ef2e7c12bfb62feb5a7b369e4f9ab37b9b7ddc7 not found: ID does not exist" Apr 16 22:23:51.409889 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.409868 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67b4f68575-44mc5"] Apr 16 22:23:51.413599 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.413579 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67b4f68575-44mc5"] Apr 16 22:23:51.452717 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:51.452691 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="984d97bd-2515-4b90-9eb2-cad3c0ac81f0" path="/var/lib/kubelet/pods/984d97bd-2515-4b90-9eb2-cad3c0ac81f0/volumes" Apr 16 22:23:58.352261 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:58.352212 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:23:58.352775 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:23:58.352618 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:08.353213 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:08.353110 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 
22:24:08.353669 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:08.353550 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:18.352312 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:18.352206 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:24:18.352811 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:18.352699 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:28.352288 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:28.352242 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:24:28.352799 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:28.352774 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:38.353025 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:38.352991 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:24:38.353485 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:38.353213 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:24:44.064809 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.064777 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq"] Apr 16 22:24:44.065222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.065150 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" containerID="cri-o://2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8" gracePeriod=30 Apr 16 22:24:44.065222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.065164 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" containerID="cri-o://65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c" gracePeriod=30 Apr 16 22:24:44.065365 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.065250 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" containerID="cri-o://887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487" gracePeriod=30 Apr 16 22:24:44.170852 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.170815 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz"] Apr 16 22:24:44.171189 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:24:44.171175 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="984d97bd-2515-4b90-9eb2-cad3c0ac81f0" containerName="console" Apr 16 22:24:44.171232 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.171191 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="984d97bd-2515-4b90-9eb2-cad3c0ac81f0" containerName="console" Apr 16 22:24:44.171265 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.171256 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="984d97bd-2515-4b90-9eb2-cad3c0ac81f0" containerName="console" Apr 16 22:24:44.175636 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.175611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.177805 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.177781 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 16 22:24:44.178015 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.178000 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 16 22:24:44.183921 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.183887 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz"] Apr 16 22:24:44.301136 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.301090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.301369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.301153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.301369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.301184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kube-api-access-zdhbv\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.301369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.301240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.401878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.401785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: 
\"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.401878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.401843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.401878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.401874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kube-api-access-zdhbv\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.402160 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.401933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.402160 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:24:44.402084 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-serving-cert: secret "isvc-sklearn-batcher-custom-predictor-serving-cert" not found Apr 16 22:24:44.402160 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:24:44.402161 2576 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls podName:f1bdae3c-23ef-453e-aac0-dfaef0c627b8 nodeName:}" failed. No retries permitted until 2026-04-16 22:24:44.902139964 +0000 UTC m=+664.026764231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls") pod "isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" (UID: "f1bdae3c-23ef-453e-aac0-dfaef0c627b8") : secret "isvc-sklearn-batcher-custom-predictor-serving-cert" not found Apr 16 22:24:44.402362 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.402190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.402544 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.402523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.412924 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.412895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kube-api-access-zdhbv\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.567513 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.567481 2576 generic.go:358] "Generic (PLEG): container finished" podID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerID="887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487" exitCode=2 Apr 16 22:24:44.567666 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.567533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerDied","Data":"887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487"} Apr 16 22:24:44.906754 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.906716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:44.909122 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:44.909094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:45.087180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:45.087142 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:45.210092 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:45.209911 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz"] Apr 16 22:24:45.212371 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:24:45.212341 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1bdae3c_23ef_453e_aac0_dfaef0c627b8.slice/crio-5721ee035c09031a52dde1040839a3ca48b493b807d152687e73c2d84b9c579f WatchSource:0}: Error finding container 5721ee035c09031a52dde1040839a3ca48b493b807d152687e73c2d84b9c579f: Status 404 returned error can't find the container with id 5721ee035c09031a52dde1040839a3ca48b493b807d152687e73c2d84b9c579f Apr 16 22:24:45.214259 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:45.214244 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:24:45.572227 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:45.572196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerStarted","Data":"6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f"} Apr 16 22:24:45.572227 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:45.572229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerStarted","Data":"5721ee035c09031a52dde1040839a3ca48b493b807d152687e73c2d84b9c579f"} Apr 16 22:24:46.347005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:46.346957 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 16 22:24:48.352249 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:48.352200 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:24:48.352723 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:48.352581 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:48.584160 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:48.584076 2576 generic.go:358] "Generic (PLEG): container finished" podID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerID="2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8" exitCode=0 Apr 16 22:24:48.584364 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:48.584146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerDied","Data":"2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8"} Apr 16 22:24:49.588869 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:49.588837 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerID="6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f" exitCode=0 Apr 16 22:24:49.589378 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:49.588890 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerDied","Data":"6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f"} Apr 16 22:24:50.594161 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.594126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerStarted","Data":"cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7"} Apr 16 22:24:50.594161 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.594162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerStarted","Data":"7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb"} Apr 16 22:24:50.594645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.594171 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerStarted","Data":"32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478"} Apr 16 22:24:50.594645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.594479 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:50.594645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.594586 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:50.594645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.594613 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:50.596041 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.596013 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:24:50.596762 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.596740 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:50.613339 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:50.613286 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podStartSLOduration=6.613273639 podStartE2EDuration="6.613273639s" podCreationTimestamp="2026-04-16 22:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:24:50.61242765 +0000 UTC m=+669.737051938" watchObservedRunningTime="2026-04-16 22:24:50.613273639 +0000 UTC m=+669.737897926" Apr 16 22:24:51.346423 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:51.346381 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 16 22:24:51.596980 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:51.596875 2576 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:24:51.597385 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:51.597362 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:56.346522 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:56.346478 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 16 22:24:56.346892 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:56.346616 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:24:56.601208 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:56.601121 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:24:56.601783 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:56.601745 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:24:56.601962 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:56.601938 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:58.353186 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:58.353143 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:24:58.353665 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:24:58.353499 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:01.346525 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:01.346486 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 16 22:25:06.346664 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:06.346620 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 16 22:25:06.601856 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:06.601761 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:25:06.602239 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:06.602216 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:08.352696 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:08.352650 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:25:08.353137 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:08.352823 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:25:08.353137 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:08.353014 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:08.353137 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:08.353118 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:25:11.346508 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:11.346466 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 16 22:25:14.204483 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.204450 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:25:14.251550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.251523 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls\") pod \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " Apr 16 22:25:14.251706 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.251557 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " Apr 16 22:25:14.251706 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.251591 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kserve-provision-location\") pod \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " Apr 16 22:25:14.251706 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.251628 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx4nd\" (UniqueName: 
\"kubernetes.io/projected/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kube-api-access-gx4nd\") pod \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\" (UID: \"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a\") " Apr 16 22:25:14.251927 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.251907 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" (UID: "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:25:14.251966 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.251927 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" (UID: "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:25:14.253789 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.253764 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kube-api-access-gx4nd" (OuterVolumeSpecName: "kube-api-access-gx4nd") pod "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" (UID: "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a"). InnerVolumeSpecName "kube-api-access-gx4nd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:25:14.253902 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.253851 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" (UID: "7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:25:14.352826 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.352756 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:25:14.352826 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.352781 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:25:14.352826 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.352791 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:25:14.352826 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.352802 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gx4nd\" (UniqueName: \"kubernetes.io/projected/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a-kube-api-access-gx4nd\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:25:14.670355 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.670254 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerID="65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c" exitCode=0 Apr 16 22:25:14.670355 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.670308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerDied","Data":"65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c"} Apr 16 22:25:14.670355 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.670354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" event={"ID":"7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a","Type":"ContainerDied","Data":"61c11a7e991094c8e35007d2882f8c50b0bc2a94cf89f10541fd044bb6677666"} Apr 16 22:25:14.670601 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.670371 2576 scope.go:117] "RemoveContainer" containerID="65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c" Apr 16 22:25:14.670601 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.670374 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq" Apr 16 22:25:14.680278 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.680179 2576 scope.go:117] "RemoveContainer" containerID="887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487" Apr 16 22:25:14.687504 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.687485 2576 scope.go:117] "RemoveContainer" containerID="2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8" Apr 16 22:25:14.691653 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.691629 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq"] Apr 16 22:25:14.699384 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.697813 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-cv6vq"] Apr 16 22:25:14.699859 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.699827 2576 scope.go:117] "RemoveContainer" containerID="cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f" Apr 16 22:25:14.706853 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.706831 2576 scope.go:117] "RemoveContainer" containerID="65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c" Apr 16 22:25:14.707191 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:25:14.707163 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c\": container with ID starting with 65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c not found: ID does not exist" containerID="65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c" Apr 16 22:25:14.707261 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.707212 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c"} err="failed to get container status \"65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c\": rpc error: code = NotFound desc = could not find container \"65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c\": container with ID starting with 65173c297e799de30cbf577a2967e188a55cda8686f93e471e0bc5bca1c3075c not found: ID does not exist" Apr 16 22:25:14.707346 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.707267 2576 scope.go:117] "RemoveContainer" containerID="887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487" Apr 16 22:25:14.707584 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:25:14.707567 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487\": container with ID starting with 887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487 not found: ID does not exist" containerID="887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487" Apr 16 22:25:14.707632 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.707590 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487"} err="failed to get container status \"887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487\": rpc error: code = NotFound desc = could not find container \"887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487\": container with ID starting with 887d58e2c9e2a1ae627f1a25b32cf8a5fd1855a827d7b36c73637ba1a3c41487 not found: ID does not exist" Apr 16 22:25:14.707632 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.707605 2576 scope.go:117] "RemoveContainer" containerID="2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8" Apr 16 22:25:14.707828 ip-10-0-133-183 
kubenswrapper[2576]: E0416 22:25:14.707809 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8\": container with ID starting with 2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8 not found: ID does not exist" containerID="2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8" Apr 16 22:25:14.707870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.707835 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8"} err="failed to get container status \"2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8\": rpc error: code = NotFound desc = could not find container \"2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8\": container with ID starting with 2d49e61f92398ae6e420b03864ef2b87e142c24f4266be09c3552799f6821ae8 not found: ID does not exist" Apr 16 22:25:14.707870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.707851 2576 scope.go:117] "RemoveContainer" containerID="cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f" Apr 16 22:25:14.708045 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:25:14.708028 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f\": container with ID starting with cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f not found: ID does not exist" containerID="cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f" Apr 16 22:25:14.708087 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:14.708055 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f"} 
err="failed to get container status \"cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f\": rpc error: code = NotFound desc = could not find container \"cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f\": container with ID starting with cf86e5f76eee53c891287b93c4f674077731dde09be0e8e28b9e25120410c80f not found: ID does not exist" Apr 16 22:25:15.453146 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:15.453102 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" path="/var/lib/kubelet/pods/7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a/volumes" Apr 16 22:25:16.602068 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:16.602028 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:25:16.602502 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:16.602479 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:26.601803 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:26.601765 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:25:26.602283 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:26.602212 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" 
podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:36.601632 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:36.601542 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:25:36.602085 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:36.602057 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:46.602126 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:46.602086 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:25:46.602626 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:46.602604 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:56.603094 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:56.603060 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:25:56.603535 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:25:56.603208 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:26:09.245535 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.245506 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz"] Apr 16 22:26:09.246056 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.245825 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" containerID="cri-o://32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478" gracePeriod=30 Apr 16 22:26:09.246056 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.245891 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" containerID="cri-o://7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb" gracePeriod=30 Apr 16 22:26:09.246056 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.245874 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" containerID="cri-o://cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7" gracePeriod=30 Apr 16 22:26:09.306890 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.306853 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz"] Apr 16 22:26:09.307260 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307244 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="storage-initializer" Apr 16 
22:26:09.307303 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307263 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="storage-initializer" Apr 16 22:26:09.307303 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307274 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" Apr 16 22:26:09.307303 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307283 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" Apr 16 22:26:09.307303 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307296 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" Apr 16 22:26:09.307303 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307302 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" Apr 16 22:26:09.307480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307313 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" Apr 16 22:26:09.307480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307319 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" Apr 16 22:26:09.307480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307398 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kserve-container" Apr 16 22:26:09.307480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307408 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="agent" Apr 16 22:26:09.307480 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.307415 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e6fc2d5-a1c9-4aab-a069-e5f89a741f6a" containerName="kube-rbac-proxy" Apr 16 22:26:09.310489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.310472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.313011 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.312990 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 16 22:26:09.313133 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.313039 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 16 22:26:09.320360 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.320315 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz"] Apr 16 22:26:09.389873 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.389841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89bb\" (UniqueName: \"kubernetes.io/projected/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-kube-api-access-v89bb\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.390001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.389930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: 
\"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.390001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.389978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.491254 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.491207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.491451 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.491270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.491451 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.491366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v89bb\" (UniqueName: \"kubernetes.io/projected/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-kube-api-access-v89bb\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.491546 ip-10-0-133-183 kubenswrapper[2576]: 
E0416 22:26:09.491449 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 16 22:26:09.491546 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:26:09.491514 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls podName:99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2 nodeName:}" failed. No retries permitted until 2026-04-16 22:26:09.991497353 +0000 UTC m=+749.116121617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-dqtxz" (UID: "99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2") : secret "message-dumper-predictor-serving-cert" not found Apr 16 22:26:09.491961 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.491942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.499764 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.499706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v89bb\" (UniqueName: \"kubernetes.io/projected/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-kube-api-access-v89bb\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.849909 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.849827 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" 
containerID="7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb" exitCode=2 Apr 16 22:26:09.849909 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.849897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerDied","Data":"7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb"} Apr 16 22:26:09.995655 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.995619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:09.998080 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:09.998054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-dqtxz\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:10.220817 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:10.220772 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:10.352568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:10.352532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz"] Apr 16 22:26:10.355623 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:26:10.355595 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ac1eaf_e0f9_45ba_b6b0_e13c3dbcbcf2.slice/crio-0bf2852d44ae22e7a490fc162bc1123f4c6435b4df7ab3822366d8f571072e73 WatchSource:0}: Error finding container 0bf2852d44ae22e7a490fc162bc1123f4c6435b4df7ab3822366d8f571072e73: Status 404 returned error can't find the container with id 0bf2852d44ae22e7a490fc162bc1123f4c6435b4df7ab3822366d8f571072e73 Apr 16 22:26:10.853698 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:10.853661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" event={"ID":"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2","Type":"ContainerStarted","Data":"0bf2852d44ae22e7a490fc162bc1123f4c6435b4df7ab3822366d8f571072e73"} Apr 16 22:26:11.597847 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:11.597816 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 16 22:26:11.858657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:11.858580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" event={"ID":"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2","Type":"ContainerStarted","Data":"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26"} Apr 16 
22:26:11.858657 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:11.858615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" event={"ID":"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2","Type":"ContainerStarted","Data":"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e"} Apr 16 22:26:11.858898 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:11.858739 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:11.876116 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:11.876071 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" podStartSLOduration=1.72650301 podStartE2EDuration="2.876057545s" podCreationTimestamp="2026-04-16 22:26:09 +0000 UTC" firstStartedPulling="2026-04-16 22:26:10.357381641 +0000 UTC m=+749.482005905" lastFinishedPulling="2026-04-16 22:26:11.506936175 +0000 UTC m=+750.631560440" observedRunningTime="2026-04-16 22:26:11.875233726 +0000 UTC m=+750.999858029" watchObservedRunningTime="2026-04-16 22:26:11.876057545 +0000 UTC m=+751.000681890" Apr 16 22:26:12.861556 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:12.861523 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:12.863065 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:12.863049 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:13.868938 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:13.868908 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerID="32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478" exitCode=0 Apr 16 22:26:13.869300 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:26:13.868986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerDied","Data":"32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478"} Apr 16 22:26:16.597807 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:16.597762 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 16 22:26:16.602087 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:16.602051 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:26:16.602352 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:16.602308 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:19.881519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:19.881486 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:26:21.597416 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:21.597378 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 16 22:26:21.597812 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:21.597513 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:26:26.597432 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:26.597388 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 16 22:26:26.601908 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:26.601880 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:26:26.602167 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:26.602143 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:29.356125 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.356094 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws"] Apr 16 22:26:29.359773 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.359753 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.362059 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.362037 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 16 22:26:29.362059 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.362048 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 16 22:26:29.371533 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.371510 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws"] Apr 16 22:26:29.551699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.551661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn9jx\" (UniqueName: \"kubernetes.io/projected/ebecc214-1b9c-4326-9904-91b25d6c60fc-kube-api-access-sn9jx\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.551869 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.551729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebecc214-1b9c-4326-9904-91b25d6c60fc-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.551869 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.551773 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebecc214-1b9c-4326-9904-91b25d6c60fc-proxy-tls\") 
pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.551869 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.551841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebecc214-1b9c-4326-9904-91b25d6c60fc-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.652742 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.652631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn9jx\" (UniqueName: \"kubernetes.io/projected/ebecc214-1b9c-4326-9904-91b25d6c60fc-kube-api-access-sn9jx\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.652742 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.652708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebecc214-1b9c-4326-9904-91b25d6c60fc-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.652742 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.652740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebecc214-1b9c-4326-9904-91b25d6c60fc-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" 
Apr 16 22:26:29.652742 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.652758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebecc214-1b9c-4326-9904-91b25d6c60fc-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.653176 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.653152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebecc214-1b9c-4326-9904-91b25d6c60fc-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.653422 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.653402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebecc214-1b9c-4326-9904-91b25d6c60fc-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.655253 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.655232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebecc214-1b9c-4326-9904-91b25d6c60fc-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.660870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.660844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn9jx\" (UniqueName: 
\"kubernetes.io/projected/ebecc214-1b9c-4326-9904-91b25d6c60fc-kube-api-access-sn9jx\") pod \"isvc-logger-predictor-64d54fcc88-bsbws\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.669587 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.669566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:29.794545 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.794488 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws"] Apr 16 22:26:29.797850 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:26:29.797822 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebecc214_1b9c_4326_9904_91b25d6c60fc.slice/crio-111fd92b466adb0442fb49f74964b72d6c6a17367ec6a0621196a0a8edb57746 WatchSource:0}: Error finding container 111fd92b466adb0442fb49f74964b72d6c6a17367ec6a0621196a0a8edb57746: Status 404 returned error can't find the container with id 111fd92b466adb0442fb49f74964b72d6c6a17367ec6a0621196a0a8edb57746 Apr 16 22:26:29.921365 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.921264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerStarted","Data":"3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b"} Apr 16 22:26:29.921365 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:29.921304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerStarted","Data":"111fd92b466adb0442fb49f74964b72d6c6a17367ec6a0621196a0a8edb57746"} Apr 16 22:26:31.597275 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 22:26:31.597237 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 16 22:26:33.936258 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:33.936167 2576 generic.go:358] "Generic (PLEG): container finished" podID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerID="3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b" exitCode=0 Apr 16 22:26:33.936258 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:33.936227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerDied","Data":"3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b"} Apr 16 22:26:34.942313 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:34.942276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerStarted","Data":"4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5"} Apr 16 22:26:34.942313 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:34.942320 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerStarted","Data":"ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135"} Apr 16 22:26:34.942313 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:34.942344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" 
event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerStarted","Data":"d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12"} Apr 16 22:26:34.942906 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:34.942582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:34.962801 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:34.962749 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podStartSLOduration=5.962733718 podStartE2EDuration="5.962733718s" podCreationTimestamp="2026-04-16 22:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:26:34.960893905 +0000 UTC m=+774.085518191" watchObservedRunningTime="2026-04-16 22:26:34.962733718 +0000 UTC m=+774.087358005" Apr 16 22:26:35.945544 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:35.945501 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:35.945544 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:35.945544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:35.947051 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:35.947022 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:26:35.947733 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:35.947705 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:36.597458 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:36.597416 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 16 22:26:36.601802 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:36.601772 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:5000: connect: connection refused" Apr 16 22:26:36.601940 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:36.601912 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:26:36.602128 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:36.602107 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:36.602198 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:36.602186 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:26:36.948874 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:36.948841 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:26:36.949306 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:36.949286 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:39.402922 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.402900 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:26:39.426133 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.426104 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls\") pod \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " Apr 16 22:26:39.426250 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.426187 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kube-api-access-zdhbv\") pod \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " Apr 16 22:26:39.426250 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.426224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kserve-provision-location\") pod \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " Apr 16 22:26:39.426373 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:26:39.426260 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\" (UID: \"f1bdae3c-23ef-453e-aac0-dfaef0c627b8\") " Apr 16 22:26:39.426672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.426637 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1bdae3c-23ef-453e-aac0-dfaef0c627b8" (UID: "f1bdae3c-23ef-453e-aac0-dfaef0c627b8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:26:39.426791 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.426721 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "f1bdae3c-23ef-453e-aac0-dfaef0c627b8" (UID: "f1bdae3c-23ef-453e-aac0-dfaef0c627b8"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:26:39.428759 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.428336 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f1bdae3c-23ef-453e-aac0-dfaef0c627b8" (UID: "f1bdae3c-23ef-453e-aac0-dfaef0c627b8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:26:39.428759 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.428727 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kube-api-access-zdhbv" (OuterVolumeSpecName: "kube-api-access-zdhbv") pod "f1bdae3c-23ef-453e-aac0-dfaef0c627b8" (UID: "f1bdae3c-23ef-453e-aac0-dfaef0c627b8"). InnerVolumeSpecName "kube-api-access-zdhbv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:26:39.527624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.527535 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kube-api-access-zdhbv\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:26:39.527823 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.527794 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:26:39.527823 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.527826 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:26:39.527962 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.527840 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1bdae3c-23ef-453e-aac0-dfaef0c627b8-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:26:39.966172 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.966137 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerID="cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7" exitCode=0 Apr 16 22:26:39.966370 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.966227 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" Apr 16 22:26:39.966370 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.966222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerDied","Data":"cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7"} Apr 16 22:26:39.966370 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.966272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz" event={"ID":"f1bdae3c-23ef-453e-aac0-dfaef0c627b8","Type":"ContainerDied","Data":"5721ee035c09031a52dde1040839a3ca48b493b807d152687e73c2d84b9c579f"} Apr 16 22:26:39.966370 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.966294 2576 scope.go:117] "RemoveContainer" containerID="cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7" Apr 16 22:26:39.974083 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.974063 2576 scope.go:117] "RemoveContainer" containerID="7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb" Apr 16 22:26:39.981039 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.981021 2576 scope.go:117] "RemoveContainer" containerID="32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478" Apr 16 22:26:39.986267 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.986243 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz"] Apr 16 22:26:39.988549 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:26:39.988533 2576 scope.go:117] "RemoveContainer" containerID="6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f" Apr 16 22:26:39.992606 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.992583 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-w2mzz"] Apr 16 22:26:39.996517 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.996502 2576 scope.go:117] "RemoveContainer" containerID="cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7" Apr 16 22:26:39.996778 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:26:39.996759 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7\": container with ID starting with cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7 not found: ID does not exist" containerID="cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7" Apr 16 22:26:39.996851 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.996789 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7"} err="failed to get container status \"cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7\": rpc error: code = NotFound desc = could not find container \"cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7\": container with ID starting with cbae2e9c02662c50ac2c0288e5efbc8c965acce59ebe141b709739bce40cf5e7 not found: ID does not exist" Apr 16 22:26:39.996851 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.996808 2576 scope.go:117] "RemoveContainer" containerID="7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb" Apr 16 22:26:39.997058 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:26:39.997041 2576 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb\": container with ID starting with 7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb not found: ID does not exist" containerID="7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb" Apr 16 22:26:39.997107 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.997067 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb"} err="failed to get container status \"7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb\": rpc error: code = NotFound desc = could not find container \"7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb\": container with ID starting with 7c84ed4bf442b7d3834e3aff5b9f852d41dc339be5178033347158d64b8d6abb not found: ID does not exist" Apr 16 22:26:39.997107 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.997085 2576 scope.go:117] "RemoveContainer" containerID="32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478" Apr 16 22:26:39.997304 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:26:39.997287 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478\": container with ID starting with 32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478 not found: ID does not exist" containerID="32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478" Apr 16 22:26:39.997369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.997309 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478"} err="failed to get container status 
\"32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478\": rpc error: code = NotFound desc = could not find container \"32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478\": container with ID starting with 32f22c0ad885e5349ad1a6111ce0f34d55c22d1ae542029ebbe87057bbb5c478 not found: ID does not exist" Apr 16 22:26:39.997369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.997338 2576 scope.go:117] "RemoveContainer" containerID="6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f" Apr 16 22:26:39.997570 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:26:39.997554 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f\": container with ID starting with 6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f not found: ID does not exist" containerID="6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f" Apr 16 22:26:39.997625 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:39.997574 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f"} err="failed to get container status \"6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f\": rpc error: code = NotFound desc = could not find container \"6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f\": container with ID starting with 6952f99bf3ee3fa26be2e5cc6e417f547a4c59e0cfb3a9b655e123ad7fa0fe7f not found: ID does not exist" Apr 16 22:26:41.453197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:41.453167 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" path="/var/lib/kubelet/pods/f1bdae3c-23ef-453e-aac0-dfaef0c627b8/volumes" Apr 16 22:26:41.953544 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:41.953518 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:26:41.954138 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:41.954105 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:26:41.954467 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:41.954445 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:51.954247 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:51.954203 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:26:51.954725 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:26:51.954637 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:01.954935 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:01.954833 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:27:01.955346 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:01.955284 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:11.954685 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:11.954635 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:27:11.955120 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:11.955098 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:21.954656 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:21.954607 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:27:21.955162 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:21.955037 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:31.954256 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:31.954206 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.31:8080: connect: connection refused" Apr 16 22:27:31.954728 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:31.954706 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:41.955252 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:41.955215 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:27:41.955666 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:41.955486 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:27:54.378483 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.378447 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-dqtxz_99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2/kserve-container/0.log" Apr 16 22:27:54.540609 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.540574 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws"] Apr 16 22:27:54.541477 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.541423 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" containerID="cri-o://d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12" gracePeriod=30 Apr 16 22:27:54.541716 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.541479 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" 
containerID="cri-o://4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5" gracePeriod=30 Apr 16 22:27:54.541845 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.541509 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy" containerID="cri-o://ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135" gracePeriod=30 Apr 16 22:27:54.593064 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593027 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"] Apr 16 22:27:54.593449 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593432 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" Apr 16 22:27:54.593547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593452 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" Apr 16 22:27:54.593547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593465 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="storage-initializer" Apr 16 22:27:54.593547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593473 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="storage-initializer" Apr 16 22:27:54.593547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593489 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" Apr 16 22:27:54.593547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593497 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" Apr 16 22:27:54.593547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593524 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" Apr 16 22:27:54.593547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593533 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" Apr 16 22:27:54.593892 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593606 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kserve-container" Apr 16 22:27:54.593892 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593621 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="kube-rbac-proxy" Apr 16 22:27:54.593892 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.593634 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1bdae3c-23ef-453e-aac0-dfaef0c627b8" containerName="agent" Apr 16 22:27:54.596948 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.596928 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.599408 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.599387 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 16 22:27:54.599514 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.599438 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 16 22:27:54.604898 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.604874 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"] Apr 16 22:27:54.621495 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.621271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.621495 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.621320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.621495 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.621363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gps\" (UniqueName: \"kubernetes.io/projected/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kube-api-access-66gps\") pod 
\"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.621495 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.621446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dff259ad-cd80-4128-ad22-6ba1c623a5b7-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.635266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.635209 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz"] Apr 16 22:27:54.635518 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.635492 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerName="kserve-container" containerID="cri-o://5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e" gracePeriod=30 Apr 16 22:27:54.635675 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.635598 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerName="kube-rbac-proxy" containerID="cri-o://5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26" gracePeriod=30 Apr 16 22:27:54.722500 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.722467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: 
\"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.722627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.722504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66gps\" (UniqueName: \"kubernetes.io/projected/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kube-api-access-66gps\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.722627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.722582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dff259ad-cd80-4128-ad22-6ba1c623a5b7-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.722698 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:27:54.722624 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found Apr 16 22:27:54.722698 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.722644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.722698 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:27:54.722691 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls 
podName:dff259ad-cd80-4128-ad22-6ba1c623a5b7 nodeName:}" failed. No retries permitted until 2026-04-16 22:27:55.222670042 +0000 UTC m=+854.347294320 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-z5nft" (UID: "dff259ad-cd80-4128-ad22-6ba1c623a5b7") : secret "isvc-lightgbm-predictor-serving-cert" not found Apr 16 22:27:54.723006 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.722987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.723231 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.723212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dff259ad-cd80-4128-ad22-6ba1c623a5b7-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.731370 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.731344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gps\" (UniqueName: \"kubernetes.io/projected/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kube-api-access-66gps\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:54.869713 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.869692 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:27:54.924219 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.924153 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v89bb\" (UniqueName: \"kubernetes.io/projected/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-kube-api-access-v89bb\") pod \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " Apr 16 22:27:54.924219 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.924190 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-message-dumper-kube-rbac-proxy-sar-config\") pod \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " Apr 16 22:27:54.924409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.924378 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls\") pod \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\" (UID: \"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2\") " Apr 16 22:27:54.924547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.924528 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" (UID: "99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:27:54.924642 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.924625 2576 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:27:54.926243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.926223 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" (UID: "99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:27:54.926287 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:54.926268 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-kube-api-access-v89bb" (OuterVolumeSpecName: "kube-api-access-v89bb") pod "99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" (UID: "99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2"). InnerVolumeSpecName "kube-api-access-v89bb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:27:55.025870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.025833 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:27:55.025870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.025861 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v89bb\" (UniqueName: \"kubernetes.io/projected/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2-kube-api-access-v89bb\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:27:55.210547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.210513 2576 generic.go:358] "Generic (PLEG): container finished" podID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerID="ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135" exitCode=2 Apr 16 22:27:55.210720 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.210592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerDied","Data":"ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135"} Apr 16 22:27:55.211897 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.211875 2576 generic.go:358] "Generic (PLEG): container finished" podID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerID="5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26" exitCode=2 Apr 16 22:27:55.211897 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.211896 2576 generic.go:358] "Generic (PLEG): container finished" podID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerID="5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e" exitCode=2 Apr 16 22:27:55.212010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.211945 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" Apr 16 22:27:55.212010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.211949 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" event={"ID":"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2","Type":"ContainerDied","Data":"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26"} Apr 16 22:27:55.212010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.211980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" event={"ID":"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2","Type":"ContainerDied","Data":"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e"} Apr 16 22:27:55.212010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.211990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz" event={"ID":"99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2","Type":"ContainerDied","Data":"0bf2852d44ae22e7a490fc162bc1123f4c6435b4df7ab3822366d8f571072e73"} Apr 16 22:27:55.212010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.212003 2576 scope.go:117] "RemoveContainer" containerID="5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26" Apr 16 22:27:55.219864 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.219848 2576 scope.go:117] "RemoveContainer" containerID="5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e" Apr 16 22:27:55.226678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.226662 2576 scope.go:117] "RemoveContainer" containerID="5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26" Apr 16 22:27:55.226919 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:27:55.226902 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26\": container with ID starting with 5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26 not found: ID does not exist" containerID="5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26" Apr 16 22:27:55.226967 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.226929 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26"} err="failed to get container status \"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26\": rpc error: code = NotFound desc = could not find container \"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26\": container with ID starting with 5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26 not found: ID does not exist" Apr 16 22:27:55.226967 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.226945 2576 scope.go:117] "RemoveContainer" containerID="5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e" Apr 16 22:27:55.227054 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.226907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:55.227246 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:27:55.227225 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e\": container with ID starting with 5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e not found: ID does not exist" containerID="5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e" Apr 
16 22:27:55.227319 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.227250 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e"} err="failed to get container status \"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e\": rpc error: code = NotFound desc = could not find container \"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e\": container with ID starting with 5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e not found: ID does not exist" Apr 16 22:27:55.227319 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.227266 2576 scope.go:117] "RemoveContainer" containerID="5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26" Apr 16 22:27:55.227543 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.227511 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26"} err="failed to get container status \"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26\": rpc error: code = NotFound desc = could not find container \"5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26\": container with ID starting with 5ae70865f3cf491a7f002df7d3278a12a73125e324c6aedb6436bfb101b66b26 not found: ID does not exist" Apr 16 22:27:55.227543 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.227536 2576 scope.go:117] "RemoveContainer" containerID="5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e" Apr 16 22:27:55.227788 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.227770 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e"} err="failed to get container status \"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e\": rpc error: code = NotFound 
desc = could not find container \"5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e\": container with ID starting with 5b8aba4a1632f22ba7a49eb99413199fefa2c01b3d8280ff4394093e18e9e89e not found: ID does not exist" Apr 16 22:27:55.229374 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.229307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-z5nft\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:55.237494 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.233709 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz"] Apr 16 22:27:55.239872 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.239849 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dqtxz"] Apr 16 22:27:55.452728 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.452693 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" path="/var/lib/kubelet/pods/99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2/volumes" Apr 16 22:27:55.508199 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.508117 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:27:55.632223 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:55.632197 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"] Apr 16 22:27:56.218398 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:56.218351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" event={"ID":"dff259ad-cd80-4128-ad22-6ba1c623a5b7","Type":"ContainerStarted","Data":"213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624"} Apr 16 22:27:56.218398 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:56.218387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" event={"ID":"dff259ad-cd80-4128-ad22-6ba1c623a5b7","Type":"ContainerStarted","Data":"8e250e00be29655591d5f5018848e46513e2ae25da5dc553726c34777fd414e7"} Apr 16 22:27:56.949256 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:56.949213 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 16 22:27:59.230813 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:59.230783 2576 generic.go:358] "Generic (PLEG): container finished" podID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerID="d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12" exitCode=0 Apr 16 22:27:59.231186 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:27:59.230836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" 
event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerDied","Data":"d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12"} Apr 16 22:28:00.235270 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:00.235235 2576 generic.go:358] "Generic (PLEG): container finished" podID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerID="213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624" exitCode=0 Apr 16 22:28:00.235717 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:00.235304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" event={"ID":"dff259ad-cd80-4128-ad22-6ba1c623a5b7","Type":"ContainerDied","Data":"213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624"} Apr 16 22:28:01.949403 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:01.949353 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 16 22:28:01.955078 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:01.955022 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:28:01.955501 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:01.955467 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:06.949460 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:06.949415 2576 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 16 22:28:06.949916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:06.949607 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:28:07.264540 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:07.264462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" event={"ID":"dff259ad-cd80-4128-ad22-6ba1c623a5b7","Type":"ContainerStarted","Data":"695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b"} Apr 16 22:28:07.264540 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:07.264501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" event={"ID":"dff259ad-cd80-4128-ad22-6ba1c623a5b7","Type":"ContainerStarted","Data":"f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b"} Apr 16 22:28:07.264839 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:07.264813 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:28:07.283343 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:07.283283 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podStartSLOduration=6.5652302129999995 podStartE2EDuration="13.283270219s" podCreationTimestamp="2026-04-16 22:27:54 +0000 UTC" firstStartedPulling="2026-04-16 22:28:00.236566519 +0000 UTC m=+859.361190784" lastFinishedPulling="2026-04-16 22:28:06.954606512 +0000 UTC m=+866.079230790" observedRunningTime="2026-04-16 
22:28:07.282445514 +0000 UTC m=+866.407069802" watchObservedRunningTime="2026-04-16 22:28:07.283270219 +0000 UTC m=+866.407894505" Apr 16 22:28:08.267506 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:08.267471 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:28:08.268720 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:08.268696 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 22:28:09.275248 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:09.275197 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 22:28:11.949679 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:11.949633 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 16 22:28:11.954074 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:11.954038 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 22:28:11.954362 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:11.954316 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:14.279056 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:14.279028 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" Apr 16 22:28:14.279566 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:14.279538 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 22:28:16.949214 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:16.949170 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 16 22:28:21.949204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:21.949160 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 16 22:28:21.954676 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:21.954639 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" 
Apr 16 22:28:21.954803 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:21.954778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:28:21.955021 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:21.954995 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:21.955126 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:21.955072 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:28:24.279663 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.279622 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 22:28:24.737004 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.736978 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" Apr 16 22:28:24.873235 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.873153 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebecc214-1b9c-4326-9904-91b25d6c60fc-isvc-logger-kube-rbac-proxy-sar-config\") pod \"ebecc214-1b9c-4326-9904-91b25d6c60fc\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " Apr 16 22:28:24.873235 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.873214 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebecc214-1b9c-4326-9904-91b25d6c60fc-kserve-provision-location\") pod \"ebecc214-1b9c-4326-9904-91b25d6c60fc\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " Apr 16 22:28:24.873235 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.873237 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn9jx\" (UniqueName: \"kubernetes.io/projected/ebecc214-1b9c-4326-9904-91b25d6c60fc-kube-api-access-sn9jx\") pod \"ebecc214-1b9c-4326-9904-91b25d6c60fc\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " Apr 16 22:28:24.873481 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.873270 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebecc214-1b9c-4326-9904-91b25d6c60fc-proxy-tls\") pod \"ebecc214-1b9c-4326-9904-91b25d6c60fc\" (UID: \"ebecc214-1b9c-4326-9904-91b25d6c60fc\") " Apr 16 22:28:24.873594 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.873560 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebecc214-1b9c-4326-9904-91b25d6c60fc-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod 
"ebecc214-1b9c-4326-9904-91b25d6c60fc" (UID: "ebecc214-1b9c-4326-9904-91b25d6c60fc"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:28:24.873594 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.873572 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebecc214-1b9c-4326-9904-91b25d6c60fc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ebecc214-1b9c-4326-9904-91b25d6c60fc" (UID: "ebecc214-1b9c-4326-9904-91b25d6c60fc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:24.875388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.875367 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebecc214-1b9c-4326-9904-91b25d6c60fc-kube-api-access-sn9jx" (OuterVolumeSpecName: "kube-api-access-sn9jx") pod "ebecc214-1b9c-4326-9904-91b25d6c60fc" (UID: "ebecc214-1b9c-4326-9904-91b25d6c60fc"). InnerVolumeSpecName "kube-api-access-sn9jx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:28:24.875448 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.875416 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebecc214-1b9c-4326-9904-91b25d6c60fc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ebecc214-1b9c-4326-9904-91b25d6c60fc" (UID: "ebecc214-1b9c-4326-9904-91b25d6c60fc"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:28:24.974244 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.974207 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ebecc214-1b9c-4326-9904-91b25d6c60fc-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:28:24.974244 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.974238 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebecc214-1b9c-4326-9904-91b25d6c60fc-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:28:24.974244 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.974249 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sn9jx\" (UniqueName: \"kubernetes.io/projected/ebecc214-1b9c-4326-9904-91b25d6c60fc-kube-api-access-sn9jx\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:28:24.974486 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:24.974259 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebecc214-1b9c-4326-9904-91b25d6c60fc-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:28:25.327566 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.327531 2576 generic.go:358] "Generic (PLEG): container finished" podID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerID="4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5" exitCode=0 Apr 16 22:28:25.327951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.327617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerDied","Data":"4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5"} 
Apr 16 22:28:25.327951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.327654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws" event={"ID":"ebecc214-1b9c-4326-9904-91b25d6c60fc","Type":"ContainerDied","Data":"111fd92b466adb0442fb49f74964b72d6c6a17367ec6a0621196a0a8edb57746"}
Apr 16 22:28:25.327951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.327671 2576 scope.go:117] "RemoveContainer" containerID="4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5"
Apr 16 22:28:25.327951 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.327625 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws"
Apr 16 22:28:25.335115 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.335069 2576 scope.go:117] "RemoveContainer" containerID="ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135"
Apr 16 22:28:25.343389 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.343347 2576 scope.go:117] "RemoveContainer" containerID="d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12"
Apr 16 22:28:25.349937 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.349921 2576 scope.go:117] "RemoveContainer" containerID="3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b"
Apr 16 22:28:25.356698 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.356672 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws"]
Apr 16 22:28:25.357765 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.357750 2576 scope.go:117] "RemoveContainer" containerID="4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5"
Apr 16 22:28:25.358003 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:28:25.357985 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5\": container with ID starting with 4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5 not found: ID does not exist" containerID="4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5"
Apr 16 22:28:25.358058 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.358015 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5"} err="failed to get container status \"4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5\": rpc error: code = NotFound desc = could not find container \"4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5\": container with ID starting with 4056e569c238749606863b092f9a29ba998cc9c249499a482c5bd2eebc4a98d5 not found: ID does not exist"
Apr 16 22:28:25.358058 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.358034 2576 scope.go:117] "RemoveContainer" containerID="ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135"
Apr 16 22:28:25.358272 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:28:25.358247 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135\": container with ID starting with ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135 not found: ID does not exist" containerID="ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135"
Apr 16 22:28:25.358381 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.358283 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135"} err="failed to get container status \"ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135\": rpc error: code = NotFound desc = could not find container \"ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135\": container with ID starting with ffd7c7481c1b66268a3c44da0f9e53cc00965f35787f93c845813b3356f18135 not found: ID does not exist"
Apr 16 22:28:25.358381 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.358305 2576 scope.go:117] "RemoveContainer" containerID="d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12"
Apr 16 22:28:25.358630 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:28:25.358610 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12\": container with ID starting with d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12 not found: ID does not exist" containerID="d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12"
Apr 16 22:28:25.358677 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.358634 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12"} err="failed to get container status \"d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12\": rpc error: code = NotFound desc = could not find container \"d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12\": container with ID starting with d40e75ec42dc7fa3690c9b95cd1be89fe5eaae527b81a4ed5e4ce1a1224bbc12 not found: ID does not exist"
Apr 16 22:28:25.358677 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.358649 2576 scope.go:117] "RemoveContainer" containerID="3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b"
Apr 16 22:28:25.358873 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:28:25.358857 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b\": container with ID starting with 3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b not found: ID does not exist" containerID="3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b"
Apr 16 22:28:25.358916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.358878 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b"} err="failed to get container status \"3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b\": rpc error: code = NotFound desc = could not find container \"3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b\": container with ID starting with 3dfac0e5bf8a19dc434b5711005442a3d59c4687e129c3556d020ef67731be9b not found: ID does not exist"
Apr 16 22:28:25.362388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.362365 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-bsbws"]
Apr 16 22:28:25.452942 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:25.452911 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" path="/var/lib/kubelet/pods/ebecc214-1b9c-4326-9904-91b25d6c60fc/volumes"
Apr 16 22:28:34.279573 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:34.279486 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 22:28:44.280271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:44.280222 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 22:28:54.280039 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:28:54.279997 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 22:29:04.279575 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:04.279536 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 22:29:14.280105 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:14.280066 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 22:29:18.449596 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:18.449552 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 22:29:28.450179 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:28.450143 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"
Apr 16 22:29:34.732553 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.732519 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"]
Apr 16 22:29:34.733030 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.732820 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" containerID="cri-o://f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b" gracePeriod=30
Apr 16 22:29:34.733030 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.732878 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kube-rbac-proxy" containerID="cri-o://695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b" gracePeriod=30
Apr 16 22:29:34.826358 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826306 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"]
Apr 16 22:29:34.826660 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826645 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container"
Apr 16 22:29:34.826709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826661 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container"
Apr 16 22:29:34.826709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826675 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerName="kserve-container"
Apr 16 22:29:34.826709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826680 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerName="kserve-container"
Apr 16 22:29:34.826709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826686 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="storage-initializer"
Apr 16 22:29:34.826709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826692 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="storage-initializer"
Apr 16 22:29:34.826709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826708 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826714 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826722 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerName="kube-rbac-proxy"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826726 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerName="kube-rbac-proxy"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826734 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826739 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826806 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kserve-container"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826814 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="kube-rbac-proxy"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826820 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerName="kube-rbac-proxy"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826826 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebecc214-1b9c-4326-9904-91b25d6c60fc" containerName="agent"
Apr 16 22:29:34.826886 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.826833 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="99ac1eaf-e0f9-45ba-b6b0-e13c3dbcbcf2" containerName="kserve-container"
Apr 16 22:29:34.831131 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.831113 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:34.833732 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.833708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\""
Apr 16 22:29:34.833830 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.833712 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\""
Apr 16 22:29:34.839651 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.839632 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"]
Apr 16 22:29:34.921577 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.921540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/208c740f-f814-4669-b673-ca97f693d1f7-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:34.921753 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.921598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/208c740f-f814-4669-b673-ca97f693d1f7-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:34.921753 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.921619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/208c740f-f814-4669-b673-ca97f693d1f7-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:34.921753 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:34.921662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f22wr\" (UniqueName: \"kubernetes.io/projected/208c740f-f814-4669-b673-ca97f693d1f7-kube-api-access-f22wr\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.022964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.022878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/208c740f-f814-4669-b673-ca97f693d1f7-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.022964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.022937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/208c740f-f814-4669-b673-ca97f693d1f7-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.022964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.022958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/208c740f-f814-4669-b673-ca97f693d1f7-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.023247 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.022985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f22wr\" (UniqueName: \"kubernetes.io/projected/208c740f-f814-4669-b673-ca97f693d1f7-kube-api-access-f22wr\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.023313 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.023292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/208c740f-f814-4669-b673-ca97f693d1f7-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.023675 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.023651 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/208c740f-f814-4669-b673-ca97f693d1f7-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.025409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.025390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/208c740f-f814-4669-b673-ca97f693d1f7-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.034296 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.034266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f22wr\" (UniqueName: \"kubernetes.io/projected/208c740f-f814-4669-b673-ca97f693d1f7-kube-api-access-f22wr\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.143221 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.143173 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:35.264897 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.264860 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"]
Apr 16 22:29:35.267789 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:29:35.267757 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod208c740f_f814_4669_b673_ca97f693d1f7.slice/crio-13533ddb1f5121e20c9d3546b03d93d3b7158775c423ccb6494883614a4101d6 WatchSource:0}: Error finding container 13533ddb1f5121e20c9d3546b03d93d3b7158775c423ccb6494883614a4101d6: Status 404 returned error can't find the container with id 13533ddb1f5121e20c9d3546b03d93d3b7158775c423ccb6494883614a4101d6
Apr 16 22:29:35.548584 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.548496 2576 generic.go:358] "Generic (PLEG): container finished" podID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerID="695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b" exitCode=2
Apr 16 22:29:35.548741 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.548574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" event={"ID":"dff259ad-cd80-4128-ad22-6ba1c623a5b7","Type":"ContainerDied","Data":"695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b"}
Apr 16 22:29:35.549803 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.549781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" event={"ID":"208c740f-f814-4669-b673-ca97f693d1f7","Type":"ContainerStarted","Data":"5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9"}
Apr 16 22:29:35.549908 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:35.549809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" event={"ID":"208c740f-f814-4669-b673-ca97f693d1f7","Type":"ContainerStarted","Data":"13533ddb1f5121e20c9d3546b03d93d3b7158775c423ccb6494883614a4101d6"}
Apr 16 22:29:38.450390 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:38.450309 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 22:29:39.282517 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.282492 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"
Apr 16 22:29:39.357880 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.357844 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dff259ad-cd80-4128-ad22-6ba1c623a5b7-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") "
Apr 16 22:29:39.357880 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.357892 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66gps\" (UniqueName: \"kubernetes.io/projected/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kube-api-access-66gps\") pod \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") "
Apr 16 22:29:39.358102 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.357939 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kserve-provision-location\") pod \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") "
Apr 16 22:29:39.358102 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.357997 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls\") pod \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\" (UID: \"dff259ad-cd80-4128-ad22-6ba1c623a5b7\") "
Apr 16 22:29:39.358271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.358244 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dff259ad-cd80-4128-ad22-6ba1c623a5b7" (UID: "dff259ad-cd80-4128-ad22-6ba1c623a5b7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:29:39.358369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.358283 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff259ad-cd80-4128-ad22-6ba1c623a5b7-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "dff259ad-cd80-4128-ad22-6ba1c623a5b7" (UID: "dff259ad-cd80-4128-ad22-6ba1c623a5b7"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:29:39.360146 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.360114 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dff259ad-cd80-4128-ad22-6ba1c623a5b7" (UID: "dff259ad-cd80-4128-ad22-6ba1c623a5b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:29:39.360146 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.360125 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kube-api-access-66gps" (OuterVolumeSpecName: "kube-api-access-66gps") pod "dff259ad-cd80-4128-ad22-6ba1c623a5b7" (UID: "dff259ad-cd80-4128-ad22-6ba1c623a5b7"). InnerVolumeSpecName "kube-api-access-66gps". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:29:39.458509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.458481 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dff259ad-cd80-4128-ad22-6ba1c623a5b7-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:29:39.458509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.458506 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-66gps\" (UniqueName: \"kubernetes.io/projected/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kube-api-access-66gps\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:29:39.458509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.458516 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff259ad-cd80-4128-ad22-6ba1c623a5b7-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:29:39.459003 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.458526 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff259ad-cd80-4128-ad22-6ba1c623a5b7-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:29:39.562633 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.562601 2576 generic.go:358] "Generic (PLEG): container finished" podID="208c740f-f814-4669-b673-ca97f693d1f7" containerID="5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9" exitCode=0
Apr 16 22:29:39.562795 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.562670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" event={"ID":"208c740f-f814-4669-b673-ca97f693d1f7","Type":"ContainerDied","Data":"5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9"}
Apr 16 22:29:39.564393 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.564368 2576 generic.go:358] "Generic (PLEG): container finished" podID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerID="f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b" exitCode=0
Apr 16 22:29:39.564505 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.564405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" event={"ID":"dff259ad-cd80-4128-ad22-6ba1c623a5b7","Type":"ContainerDied","Data":"f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b"}
Apr 16 22:29:39.564505 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.564437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" event={"ID":"dff259ad-cd80-4128-ad22-6ba1c623a5b7","Type":"ContainerDied","Data":"8e250e00be29655591d5f5018848e46513e2ae25da5dc553726c34777fd414e7"}
Apr 16 22:29:39.564505 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.564449 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"
Apr 16 22:29:39.564643 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.564455 2576 scope.go:117] "RemoveContainer" containerID="695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b"
Apr 16 22:29:39.572180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.572162 2576 scope.go:117] "RemoveContainer" containerID="f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b"
Apr 16 22:29:39.578852 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.578832 2576 scope.go:117] "RemoveContainer" containerID="213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624"
Apr 16 22:29:39.586927 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.586887 2576 scope.go:117] "RemoveContainer" containerID="695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b"
Apr 16 22:29:39.587178 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:29:39.587161 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b\": container with ID starting with 695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b not found: ID does not exist" containerID="695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b"
Apr 16 22:29:39.587244 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.587187 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b"} err="failed to get container status \"695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b\": rpc error: code = NotFound desc = could not find container \"695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b\": container with ID starting with 695f48092f366e3fc5438babc9ce2d3770c8961cbca5db33fa7861aa8521f86b not found: ID does not exist"
Apr 16 22:29:39.587244 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.587206 2576 scope.go:117] "RemoveContainer" containerID="f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b"
Apr 16 22:29:39.587497 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:29:39.587481 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b\": container with ID starting with f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b not found: ID does not exist" containerID="f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b"
Apr 16 22:29:39.587550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.587502 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b"} err="failed to get container status \"f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b\": rpc error: code = NotFound desc = could not find container \"f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b\": container with ID starting with f9ee24762db0c94b5f4cc3c25d3311a324153228145f518134ec01a9bebcff0b not found: ID does not exist"
Apr 16 22:29:39.587550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.587523 2576 scope.go:117] "RemoveContainer" containerID="213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624"
Apr 16 22:29:39.587776 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:29:39.587759 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624\": container with ID starting with 213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624 not found: ID does not exist" containerID="213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624"
Apr 16 22:29:39.587822 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.587781 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624"} err="failed to get container status \"213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624\": rpc error: code = NotFound desc = could not find container \"213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624\": container with ID starting with 213f90877afd28e11bce7f69e45d5357531246718a2c2427ccab1a2fcb114624 not found: ID does not exist"
Apr 16 22:29:39.592319 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.592298 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"]
Apr 16 22:29:39.596062 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:39.596040 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft"]
Apr 16 22:29:40.276718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:40.276670 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-z5nft" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 16 22:29:40.568981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:40.568884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" event={"ID":"208c740f-f814-4669-b673-ca97f693d1f7","Type":"ContainerStarted","Data":"3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f"}
Apr 16 22:29:40.568981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:40.568928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" event={"ID":"208c740f-f814-4669-b673-ca97f693d1f7","Type":"ContainerStarted","Data":"6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3"}
Apr 16 22:29:40.569496 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:40.569245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:40.569496 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:40.569388 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:40.570673 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:40.570649 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 22:29:40.588123 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:40.588071 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podStartSLOduration=6.588057426 podStartE2EDuration="6.588057426s" podCreationTimestamp="2026-04-16 22:29:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:29:40.586533165 +0000 UTC m=+959.711157451" watchObservedRunningTime="2026-04-16 22:29:40.588057426 +0000 UTC m=+959.712681713"
Apr 16 22:29:41.452763 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:41.452715 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" path="/var/lib/kubelet/pods/dff259ad-cd80-4128-ad22-6ba1c623a5b7/volumes"
Apr 16 22:29:41.573167 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:41.573125 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 22:29:46.577475 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:46.577449 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"
Apr 16 22:29:46.577957 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:46.577925 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 22:29:56.578717 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:29:56.578675 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 22:30:06.578077 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:30:06.577986 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 22:30:16.578714 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:30:16.578678 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container"
probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 22:30:26.578711 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:30:26.578668 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 22:30:36.578895 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:30:36.578853 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 22:30:46.578094 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:30:46.578054 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 22:30:56.579042 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:30:56.579007 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" Apr 16 22:31:05.077359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.077308 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"] Apr 16 22:31:05.078156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.078126 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" 
containerID="cri-o://6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3" gracePeriod=30 Apr 16 22:31:05.078297 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.078162 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kube-rbac-proxy" containerID="cri-o://3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f" gracePeriod=30 Apr 16 22:31:05.200063 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200034 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6"] Apr 16 22:31:05.200381 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200367 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kube-rbac-proxy" Apr 16 22:31:05.200433 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200383 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kube-rbac-proxy" Apr 16 22:31:05.200433 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200393 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" Apr 16 22:31:05.200433 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200398 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" Apr 16 22:31:05.200433 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200407 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="storage-initializer" Apr 16 22:31:05.200433 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200415 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="storage-initializer" Apr 16 22:31:05.200603 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200488 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kserve-container" Apr 16 22:31:05.200603 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.200499 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dff259ad-cd80-4128-ad22-6ba1c623a5b7" containerName="kube-rbac-proxy" Apr 16 22:31:05.203358 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.203344 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.205615 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.205592 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 16 22:31:05.205746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.205618 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 16 22:31:05.212438 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.212416 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6"] Apr 16 22:31:05.231843 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.231814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.231946 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:31:05.231845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.231946 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.231919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlbbr\" (UniqueName: \"kubernetes.io/projected/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kube-api-access-nlbbr\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.232026 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.231974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.333001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.332919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.333001 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.332973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.333001 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.332991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.333235 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.333035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlbbr\" (UniqueName: \"kubernetes.io/projected/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kube-api-access-nlbbr\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.333235 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:31:05.333143 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-serving-cert: secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 16 22:31:05.333235 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:31:05.333219 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls podName:1b4ab9ec-999c-42f4-88c5-ffedf94b7b18 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:31:05.833195273 +0000 UTC m=+1044.957819542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls") pod "isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" (UID: "1b4ab9ec-999c-42f4-88c5-ffedf94b7b18") : secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 16 22:31:05.333410 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.333363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.333618 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.333601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.342630 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.342604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlbbr\" (UniqueName: \"kubernetes.io/projected/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kube-api-access-nlbbr\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.830588 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.830557 2576 generic.go:358] 
"Generic (PLEG): container finished" podID="208c740f-f814-4669-b673-ca97f693d1f7" containerID="3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f" exitCode=2 Apr 16 22:31:05.830753 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.830624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" event={"ID":"208c740f-f814-4669-b673-ca97f693d1f7","Type":"ContainerDied","Data":"3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f"} Apr 16 22:31:05.836055 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.836036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:05.838536 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:05.838504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:06.114687 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:06.114592 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:31:06.239559 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:06.239438 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6"] Apr 16 22:31:06.241765 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:31:06.241737 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4ab9ec_999c_42f4_88c5_ffedf94b7b18.slice/crio-7ff98c69155d9434434b505462cf5e9f894b29b88700d0d8ea7018fc88640ab2 WatchSource:0}: Error finding container 7ff98c69155d9434434b505462cf5e9f894b29b88700d0d8ea7018fc88640ab2: Status 404 returned error can't find the container with id 7ff98c69155d9434434b505462cf5e9f894b29b88700d0d8ea7018fc88640ab2 Apr 16 22:31:06.243799 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:06.243781 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:31:06.573505 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:06.573461 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 16 22:31:06.578846 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:06.578822 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 22:31:06.835427 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:06.835308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" event={"ID":"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18","Type":"ContainerStarted","Data":"22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6"} Apr 16 22:31:06.835427 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:06.835379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" event={"ID":"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18","Type":"ContainerStarted","Data":"7ff98c69155d9434434b505462cf5e9f894b29b88700d0d8ea7018fc88640ab2"} Apr 16 22:31:09.716749 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.716725 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" Apr 16 22:31:09.766922 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.766892 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/208c740f-f814-4669-b673-ca97f693d1f7-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"208c740f-f814-4669-b673-ca97f693d1f7\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " Apr 16 22:31:09.767075 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.766936 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/208c740f-f814-4669-b673-ca97f693d1f7-kserve-provision-location\") pod \"208c740f-f814-4669-b673-ca97f693d1f7\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " Apr 16 22:31:09.767075 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.767001 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f22wr\" (UniqueName: \"kubernetes.io/projected/208c740f-f814-4669-b673-ca97f693d1f7-kube-api-access-f22wr\") pod 
\"208c740f-f814-4669-b673-ca97f693d1f7\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " Apr 16 22:31:09.767075 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.767030 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/208c740f-f814-4669-b673-ca97f693d1f7-proxy-tls\") pod \"208c740f-f814-4669-b673-ca97f693d1f7\" (UID: \"208c740f-f814-4669-b673-ca97f693d1f7\") " Apr 16 22:31:09.767271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.767238 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/208c740f-f814-4669-b673-ca97f693d1f7-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "208c740f-f814-4669-b673-ca97f693d1f7" (UID: "208c740f-f814-4669-b673-ca97f693d1f7"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:31:09.767412 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.767362 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208c740f-f814-4669-b673-ca97f693d1f7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "208c740f-f814-4669-b673-ca97f693d1f7" (UID: "208c740f-f814-4669-b673-ca97f693d1f7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:31:09.768858 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.768836 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/208c740f-f814-4669-b673-ca97f693d1f7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "208c740f-f814-4669-b673-ca97f693d1f7" (UID: "208c740f-f814-4669-b673-ca97f693d1f7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:31:09.768999 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.768979 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208c740f-f814-4669-b673-ca97f693d1f7-kube-api-access-f22wr" (OuterVolumeSpecName: "kube-api-access-f22wr") pod "208c740f-f814-4669-b673-ca97f693d1f7" (UID: "208c740f-f814-4669-b673-ca97f693d1f7"). InnerVolumeSpecName "kube-api-access-f22wr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:31:09.848177 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.848093 2576 generic.go:358] "Generic (PLEG): container finished" podID="208c740f-f814-4669-b673-ca97f693d1f7" containerID="6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3" exitCode=0 Apr 16 22:31:09.848312 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.848176 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" Apr 16 22:31:09.848312 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.848176 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" event={"ID":"208c740f-f814-4669-b673-ca97f693d1f7","Type":"ContainerDied","Data":"6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3"} Apr 16 22:31:09.848312 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.848215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm" event={"ID":"208c740f-f814-4669-b673-ca97f693d1f7","Type":"ContainerDied","Data":"13533ddb1f5121e20c9d3546b03d93d3b7158775c423ccb6494883614a4101d6"} Apr 16 22:31:09.848312 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.848230 2576 scope.go:117] "RemoveContainer" containerID="3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f" Apr 16 22:31:09.856723 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.856705 2576 scope.go:117] "RemoveContainer" containerID="6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3" Apr 16 22:31:09.863727 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.863714 2576 scope.go:117] "RemoveContainer" containerID="5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9" Apr 16 22:31:09.868349 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.868309 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/208c740f-f814-4669-b673-ca97f693d1f7-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:31:09.868456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.868355 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/208c740f-f814-4669-b673-ca97f693d1f7-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:31:09.868456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.868370 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f22wr\" (UniqueName: \"kubernetes.io/projected/208c740f-f814-4669-b673-ca97f693d1f7-kube-api-access-f22wr\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:31:09.868456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.868384 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/208c740f-f814-4669-b673-ca97f693d1f7-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:31:09.868992 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.868976 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"] Apr 16 22:31:09.870539 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:31:09.870519 2576 scope.go:117] "RemoveContainer" containerID="3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f" Apr 16 22:31:09.870847 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:31:09.870824 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f\": container with ID starting with 3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f not found: ID does not exist" containerID="3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f" Apr 16 22:31:09.870953 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.870858 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f"} err="failed to get container status \"3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f\": rpc error: code = NotFound desc = could not find container \"3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f\": container with ID starting with 3face0b01ad0e14c257f1d354acb020ad5f8a277b3f4d8ffb5120649a780f95f not found: ID does not exist" Apr 16 22:31:09.871175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.871156 2576 scope.go:117] "RemoveContainer" containerID="6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3" Apr 16 22:31:09.871460 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:31:09.871437 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3\": container with ID starting with 6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3 not found: ID does not exist" containerID="6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3" Apr 16 22:31:09.871547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.871468 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3"} err="failed to get container status \"6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3\": rpc error: code = NotFound desc = could not find container \"6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3\": container with ID starting with 6b9949ea87e33833b5fa4f2fd179a5b29cc2cca54709de87885900e3fc8fe4d3 not found: ID does not exist" Apr 16 22:31:09.871547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.871490 2576 scope.go:117] "RemoveContainer" containerID="5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9" Apr 16 22:31:09.871727 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:31:09.871709 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9\": container with ID starting with 5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9 not found: ID does not exist" containerID="5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9" Apr 16 22:31:09.871782 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.871741 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9"} err="failed to get container status \"5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9\": rpc error: code = NotFound desc = could not find container \"5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9\": container with ID starting with 5ab4a9738974ab3faa77972b30d58583cd673b92a8383944b64bf1b6b5b173d9 not found: ID does not exist" Apr 16 22:31:09.872659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:09.872639 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-28jjm"] Apr 16 22:31:10.853950 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:10.853866 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerID="22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6" exitCode=0 Apr 16 22:31:10.854420 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:10.853943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" event={"ID":"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18","Type":"ContainerDied","Data":"22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6"} Apr 16 22:31:11.456196 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:31:11.455732 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="208c740f-f814-4669-b673-ca97f693d1f7" path="/var/lib/kubelet/pods/208c740f-f814-4669-b673-ca97f693d1f7/volumes" Apr 16 22:33:25.335674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:33:25.335636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" event={"ID":"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18","Type":"ContainerStarted","Data":"0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6"} Apr 16 22:33:25.335674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:33:25.335678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" event={"ID":"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18","Type":"ContainerStarted","Data":"713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c"} Apr 16 22:33:25.336096 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:33:25.335799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:33:25.336096 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:33:25.335830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:33:25.364913 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:33:25.364855 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" podStartSLOduration=6.682037717 podStartE2EDuration="2m20.36483732s" podCreationTimestamp="2026-04-16 22:31:05 +0000 UTC" firstStartedPulling="2026-04-16 22:31:10.855098742 +0000 UTC m=+1049.979723008" lastFinishedPulling="2026-04-16 22:33:24.537898332 +0000 UTC m=+1183.662522611" observedRunningTime="2026-04-16 22:33:25.363439187 +0000 UTC m=+1184.488063492" watchObservedRunningTime="2026-04-16 22:33:25.36483732 +0000 UTC m=+1184.489461606" Apr 16 22:33:31.344810 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:33:31.344779 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:34:01.349143 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:01.349113 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:34:05.334608 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.334572 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6"] Apr 16 22:34:05.335557 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.335522 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="kserve-container" containerID="cri-o://713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c" gracePeriod=30 Apr 16 22:34:05.335710 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.335534 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="kube-rbac-proxy" containerID="cri-o://0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6" gracePeriod=30 Apr 16 22:34:05.425665 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.425630 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq"] Apr 16 22:34:05.425947 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.425935 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kube-rbac-proxy" Apr 16 22:34:05.425991 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.425950 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kube-rbac-proxy" Apr 16 22:34:05.425991 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.425959 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="storage-initializer" Apr 16 22:34:05.425991 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.425965 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="storage-initializer" Apr 16 22:34:05.425991 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.425974 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" Apr 16 22:34:05.425991 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.425980 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" Apr 16 22:34:05.426140 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:34:05.426024 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kserve-container" Apr 16 22:34:05.426140 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.426033 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="208c740f-f814-4669-b673-ca97f693d1f7" containerName="kube-rbac-proxy" Apr 16 22:34:05.429999 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.429977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.432172 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.432149 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 16 22:34:05.432291 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.432173 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 16 22:34:05.436182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.435887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dnfj\" (UniqueName: \"kubernetes.io/projected/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kube-api-access-5dnfj\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.436182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.435928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: 
\"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.436182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.435972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.436182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.436129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.438776 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.438757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq"] Apr 16 22:34:05.463651 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.463613 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerID="0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6" exitCode=2 Apr 16 22:34:05.463774 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.463688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" event={"ID":"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18","Type":"ContainerDied","Data":"0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6"} Apr 16 
22:34:05.536878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.536841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.536878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.536880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dnfj\" (UniqueName: \"kubernetes.io/projected/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kube-api-access-5dnfj\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.537147 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.537003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.537147 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.537059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.537255 ip-10-0-133-183 kubenswrapper[2576]: E0416 
22:34:05.537209 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-serving-cert: secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 16 22:34:05.537296 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:05.537278 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls podName:edfa2a5e-112c-4fb6-95ae-9235b217a5ac nodeName:}" failed. No retries permitted until 2026-04-16 22:34:06.037257996 +0000 UTC m=+1225.161882262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls") pod "isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" (UID: "edfa2a5e-112c-4fb6-95ae-9235b217a5ac") : secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 16 22:34:05.537409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.537386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.537567 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.537550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:05.546057 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:05.546034 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dnfj\" (UniqueName: \"kubernetes.io/projected/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kube-api-access-5dnfj\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:06.040267 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.040217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:06.043247 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.043216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:06.341832 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.341803 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:06.384695 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.384667 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:34:06.444930 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.444898 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kserve-provision-location\") pod \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " Apr 16 22:34:06.444930 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.444933 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls\") pod \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " Apr 16 22:34:06.445158 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.444986 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " Apr 16 22:34:06.445158 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.445006 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlbbr\" (UniqueName: \"kubernetes.io/projected/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kube-api-access-nlbbr\") pod \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\" (UID: \"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18\") " Apr 16 22:34:06.445340 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.445297 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" (UID: "1b4ab9ec-999c-42f4-88c5-ffedf94b7b18"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:34:06.445408 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.445382 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" (UID: "1b4ab9ec-999c-42f4-88c5-ffedf94b7b18"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:34:06.448524 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.448066 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kube-api-access-nlbbr" (OuterVolumeSpecName: "kube-api-access-nlbbr") pod "1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" (UID: "1b4ab9ec-999c-42f4-88c5-ffedf94b7b18"). InnerVolumeSpecName "kube-api-access-nlbbr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:34:06.451315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.451264 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" (UID: "1b4ab9ec-999c-42f4-88c5-ffedf94b7b18"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:34:06.468359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.468316 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerID="713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c" exitCode=0 Apr 16 22:34:06.468480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.468382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" event={"ID":"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18","Type":"ContainerDied","Data":"713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c"} Apr 16 22:34:06.468480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.468415 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" event={"ID":"1b4ab9ec-999c-42f4-88c5-ffedf94b7b18","Type":"ContainerDied","Data":"7ff98c69155d9434434b505462cf5e9f894b29b88700d0d8ea7018fc88640ab2"} Apr 16 22:34:06.468480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.468431 2576 scope.go:117] "RemoveContainer" containerID="0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6" Apr 16 22:34:06.468480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.468448 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" Apr 16 22:34:06.475701 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.475683 2576 scope.go:117] "RemoveContainer" containerID="713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c" Apr 16 22:34:06.481349 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.481314 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq"] Apr 16 22:34:06.483682 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.483664 2576 scope.go:117] "RemoveContainer" containerID="22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6" Apr 16 22:34:06.485879 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:34:06.485856 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedfa2a5e_112c_4fb6_95ae_9235b217a5ac.slice/crio-8b2de17420aeae4a838eaa38fc59618d6fc562d42996271483d57eac3aed6a51 WatchSource:0}: Error finding container 8b2de17420aeae4a838eaa38fc59618d6fc562d42996271483d57eac3aed6a51: Status 404 returned error can't find the container with id 8b2de17420aeae4a838eaa38fc59618d6fc562d42996271483d57eac3aed6a51 Apr 16 22:34:06.489868 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.489848 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6"] Apr 16 22:34:06.491231 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.491217 2576 scope.go:117] "RemoveContainer" containerID="0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6" Apr 16 22:34:06.491580 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:06.491553 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6\": container with ID starting with 
0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6 not found: ID does not exist" containerID="0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6" Apr 16 22:34:06.491716 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.491589 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6"} err="failed to get container status \"0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6\": rpc error: code = NotFound desc = could not find container \"0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6\": container with ID starting with 0ac69c645ad4419dfb4a436d8c03afea1d52b2280616c9db84ebf8561b4186a6 not found: ID does not exist" Apr 16 22:34:06.491716 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.491614 2576 scope.go:117] "RemoveContainer" containerID="713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c" Apr 16 22:34:06.491950 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:06.491914 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c\": container with ID starting with 713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c not found: ID does not exist" containerID="713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c" Apr 16 22:34:06.492038 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.491945 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c"} err="failed to get container status \"713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c\": rpc error: code = NotFound desc = could not find container \"713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c\": container with ID starting with 
713dd6fecf05fef57e992d4e6f4b726c50591fbdebff3faf99b79fddb79e057c not found: ID does not exist" Apr 16 22:34:06.492038 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.491986 2576 scope.go:117] "RemoveContainer" containerID="22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6" Apr 16 22:34:06.492359 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:06.492285 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6\": container with ID starting with 22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6 not found: ID does not exist" containerID="22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6" Apr 16 22:34:06.492359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.492317 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6"} err="failed to get container status \"22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6\": rpc error: code = NotFound desc = could not find container \"22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6\": container with ID starting with 22544aa10a7633d1140bef6c82133fdfc01b9c1fa95f657c4ef8e5b2fa7646b6 not found: ID does not exist" Apr 16 22:34:06.493710 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.493645 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6"] Apr 16 22:34:06.546500 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.546480 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:34:06.546587 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 22:34:06.546502 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:34:06.546587 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.546514 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:34:06.546587 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:06.546524 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlbbr\" (UniqueName: \"kubernetes.io/projected/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18-kube-api-access-nlbbr\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:34:07.340429 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:07.340385 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-gmjt6" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": context deadline exceeded" Apr 16 22:34:07.453342 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:07.453295 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" path="/var/lib/kubelet/pods/1b4ab9ec-999c-42f4-88c5-ffedf94b7b18/volumes" Apr 16 22:34:07.473502 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:07.473474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" event={"ID":"edfa2a5e-112c-4fb6-95ae-9235b217a5ac","Type":"ContainerStarted","Data":"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460"} Apr 16 22:34:07.473634 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:07.473506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" event={"ID":"edfa2a5e-112c-4fb6-95ae-9235b217a5ac","Type":"ContainerStarted","Data":"8b2de17420aeae4a838eaa38fc59618d6fc562d42996271483d57eac3aed6a51"} Apr 16 22:34:10.484717 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:10.484682 2576 generic.go:358] "Generic (PLEG): container finished" podID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerID="cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460" exitCode=0 Apr 16 22:34:10.485114 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:10.484759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" event={"ID":"edfa2a5e-112c-4fb6-95ae-9235b217a5ac","Type":"ContainerDied","Data":"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460"} Apr 16 22:34:11.489569 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:11.489535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" event={"ID":"edfa2a5e-112c-4fb6-95ae-9235b217a5ac","Type":"ContainerStarted","Data":"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f"} Apr 16 22:34:11.489959 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:11.489576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" event={"ID":"edfa2a5e-112c-4fb6-95ae-9235b217a5ac","Type":"ContainerStarted","Data":"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116"} Apr 16 22:34:11.489959 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:11.489776 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:11.508774 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 22:34:11.508728 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" podStartSLOduration=6.508713334 podStartE2EDuration="6.508713334s" podCreationTimestamp="2026-04-16 22:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:34:11.508045496 +0000 UTC m=+1230.632669786" watchObservedRunningTime="2026-04-16 22:34:11.508713334 +0000 UTC m=+1230.633337622" Apr 16 22:34:12.493227 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:12.493189 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:12.494320 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:12.494292 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 22:34:13.496200 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:13.496159 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 22:34:18.501903 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:18.501877 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:18.503057 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:18.503039 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:25.473440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.473407 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq"] Apr 16 22:34:25.473997 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.473762 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kserve-container" containerID="cri-o://c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116" gracePeriod=30 Apr 16 22:34:25.473997 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.473816 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kube-rbac-proxy" containerID="cri-o://5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f" gracePeriod=30 Apr 16 22:34:25.540622 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.540593 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn"] Apr 16 22:34:25.540979 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.540962 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="kube-rbac-proxy" Apr 16 22:34:25.541061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.540982 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="kube-rbac-proxy" Apr 16 22:34:25.541061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.540996 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" 
containerName="kserve-container" Apr 16 22:34:25.541061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.541005 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="kserve-container" Apr 16 22:34:25.541061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.541019 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="storage-initializer" Apr 16 22:34:25.541061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.541028 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="storage-initializer" Apr 16 22:34:25.541315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.541106 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="kube-rbac-proxy" Apr 16 22:34:25.541315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.541120 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b4ab9ec-999c-42f4-88c5-ffedf94b7b18" containerName="kserve-container" Apr 16 22:34:25.546532 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.544912 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.548181 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.548155 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 16 22:34:25.548302 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.548157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 16 22:34:25.555792 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.555766 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn"] Apr 16 22:34:25.596652 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.596619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e079000b-d887-4175-8879-b0613e71062d-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.596652 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.596653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gj4\" (UniqueName: \"kubernetes.io/projected/e079000b-d887-4175-8879-b0613e71062d-kube-api-access-b5gj4\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.596827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.596740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e079000b-d887-4175-8879-b0613e71062d-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.596867 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.596847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.697868 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.697833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e079000b-d887-4175-8879-b0613e71062d-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.698050 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.697973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.698050 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.698000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/e079000b-d887-4175-8879-b0613e71062d-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.698050 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.698026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gj4\" (UniqueName: \"kubernetes.io/projected/e079000b-d887-4175-8879-b0613e71062d-kube-api-access-b5gj4\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.698284 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:25.698096 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 16 22:34:25.698284 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:25.698168 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls podName:e079000b-d887-4175-8879-b0613e71062d nodeName:}" failed. No retries permitted until 2026-04-16 22:34:26.198153733 +0000 UTC m=+1245.322777998 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" (UID: "e079000b-d887-4175-8879-b0613e71062d") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 16 22:34:25.698432 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.698265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e079000b-d887-4175-8879-b0613e71062d-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.698816 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.698794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e079000b-d887-4175-8879-b0613e71062d-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:25.706749 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:25.706727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gj4\" (UniqueName: \"kubernetes.io/projected/e079000b-d887-4175-8879-b0613e71062d-kube-api-access-b5gj4\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:26.203719 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.203686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:26.206125 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.206100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:26.216399 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.216372 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:26.304222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.304139 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dnfj\" (UniqueName: \"kubernetes.io/projected/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kube-api-access-5dnfj\") pod \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " Apr 16 22:34:26.304222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.304186 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls\") pod \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " Apr 16 22:34:26.304489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.304234 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kserve-provision-location\") pod 
\"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " Apr 16 22:34:26.304489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.304252 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\" (UID: \"edfa2a5e-112c-4fb6-95ae-9235b217a5ac\") " Apr 16 22:34:26.304618 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.304592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "edfa2a5e-112c-4fb6-95ae-9235b217a5ac" (UID: "edfa2a5e-112c-4fb6-95ae-9235b217a5ac"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:34:26.304706 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.304670 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "edfa2a5e-112c-4fb6-95ae-9235b217a5ac" (UID: "edfa2a5e-112c-4fb6-95ae-9235b217a5ac"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:34:26.306308 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.306285 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "edfa2a5e-112c-4fb6-95ae-9235b217a5ac" (UID: "edfa2a5e-112c-4fb6-95ae-9235b217a5ac"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:34:26.306423 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.306317 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kube-api-access-5dnfj" (OuterVolumeSpecName: "kube-api-access-5dnfj") pod "edfa2a5e-112c-4fb6-95ae-9235b217a5ac" (UID: "edfa2a5e-112c-4fb6-95ae-9235b217a5ac"). InnerVolumeSpecName "kube-api-access-5dnfj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:34:26.405692 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.405655 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:34:26.405692 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.405686 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:34:26.405692 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.405697 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dnfj\" (UniqueName: \"kubernetes.io/projected/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-kube-api-access-5dnfj\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:34:26.405910 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.405706 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfa2a5e-112c-4fb6-95ae-9235b217a5ac-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:34:26.458904 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.458875 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:26.540240 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.540211 2576 generic.go:358] "Generic (PLEG): container finished" podID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerID="5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f" exitCode=2 Apr 16 22:34:26.540240 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.540236 2576 generic.go:358] "Generic (PLEG): container finished" podID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerID="c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116" exitCode=0 Apr 16 22:34:26.540721 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.540294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" event={"ID":"edfa2a5e-112c-4fb6-95ae-9235b217a5ac","Type":"ContainerDied","Data":"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f"} Apr 16 22:34:26.540721 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.540317 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" Apr 16 22:34:26.540721 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.540361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" event={"ID":"edfa2a5e-112c-4fb6-95ae-9235b217a5ac","Type":"ContainerDied","Data":"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116"} Apr 16 22:34:26.540721 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.540381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq" event={"ID":"edfa2a5e-112c-4fb6-95ae-9235b217a5ac","Type":"ContainerDied","Data":"8b2de17420aeae4a838eaa38fc59618d6fc562d42996271483d57eac3aed6a51"} Apr 16 22:34:26.540721 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.540402 2576 scope.go:117] "RemoveContainer" containerID="5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f" Apr 16 22:34:26.550048 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.550031 2576 scope.go:117] "RemoveContainer" containerID="c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116" Apr 16 22:34:26.561213 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.561175 2576 scope.go:117] "RemoveContainer" containerID="cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460" Apr 16 22:34:26.564367 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.564345 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq"] Apr 16 22:34:26.568861 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.568836 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-vdclq"] Apr 16 22:34:26.569196 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.569183 2576 scope.go:117] "RemoveContainer" 
containerID="5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f" Apr 16 22:34:26.569478 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:26.569460 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f\": container with ID starting with 5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f not found: ID does not exist" containerID="5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f" Apr 16 22:34:26.569534 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.569485 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f"} err="failed to get container status \"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f\": rpc error: code = NotFound desc = could not find container \"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f\": container with ID starting with 5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f not found: ID does not exist" Apr 16 22:34:26.569534 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.569503 2576 scope.go:117] "RemoveContainer" containerID="c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116" Apr 16 22:34:26.569751 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:26.569721 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116\": container with ID starting with c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116 not found: ID does not exist" containerID="c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116" Apr 16 22:34:26.569878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.569750 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116"} err="failed to get container status \"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116\": rpc error: code = NotFound desc = could not find container \"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116\": container with ID starting with c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116 not found: ID does not exist" Apr 16 22:34:26.569878 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.569773 2576 scope.go:117] "RemoveContainer" containerID="cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460" Apr 16 22:34:26.570014 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:34:26.569996 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460\": container with ID starting with cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460 not found: ID does not exist" containerID="cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460" Apr 16 22:34:26.570059 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.570020 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460"} err="failed to get container status \"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460\": rpc error: code = NotFound desc = could not find container \"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460\": container with ID starting with cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460 not found: ID does not exist" Apr 16 22:34:26.570059 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.570035 2576 scope.go:117] "RemoveContainer" containerID="5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f" Apr 16 22:34:26.570250 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.570233 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f"} err="failed to get container status \"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f\": rpc error: code = NotFound desc = could not find container \"5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f\": container with ID starting with 5ab45f6856edc841f2eedef4cbce25946b738eb654e9e3045e121f496235da0f not found: ID does not exist" Apr 16 22:34:26.570294 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.570250 2576 scope.go:117] "RemoveContainer" containerID="c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116" Apr 16 22:34:26.570528 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.570506 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116"} err="failed to get container status \"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116\": rpc error: code = NotFound desc = could not find container \"c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116\": container with ID starting with c2e898d2473c7c2e3e2f0f18d7c29d9ccc170c84432d906bb883a9f2b827f116 not found: ID does not exist" Apr 16 22:34:26.570528 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.570526 2576 scope.go:117] "RemoveContainer" containerID="cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460" Apr 16 22:34:26.570765 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.570729 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460"} err="failed to get container status \"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460\": rpc error: code = NotFound desc = could not 
find container \"cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460\": container with ID starting with cf6eafa83a524d3d9579225e1e77ab8e3ac6bc9253bd9ce5b0cf88ae7e4c6460 not found: ID does not exist" Apr 16 22:34:26.584865 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:26.584835 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn"] Apr 16 22:34:26.587559 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:34:26.587523 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode079000b_d887_4175_8879_b0613e71062d.slice/crio-9315c0669e741c884350eca024045f3f18261e4438bdc7a2863dfcd5ad6befa1 WatchSource:0}: Error finding container 9315c0669e741c884350eca024045f3f18261e4438bdc7a2863dfcd5ad6befa1: Status 404 returned error can't find the container with id 9315c0669e741c884350eca024045f3f18261e4438bdc7a2863dfcd5ad6befa1 Apr 16 22:34:27.453111 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:27.453078 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" path="/var/lib/kubelet/pods/edfa2a5e-112c-4fb6-95ae-9235b217a5ac/volumes" Apr 16 22:34:27.545135 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:27.545100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" event={"ID":"e079000b-d887-4175-8879-b0613e71062d","Type":"ContainerStarted","Data":"70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162"} Apr 16 22:34:27.545135 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:27.545140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" event={"ID":"e079000b-d887-4175-8879-b0613e71062d","Type":"ContainerStarted","Data":"9315c0669e741c884350eca024045f3f18261e4438bdc7a2863dfcd5ad6befa1"} Apr 16 
22:34:30.557119 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:30.557023 2576 generic.go:358] "Generic (PLEG): container finished" podID="e079000b-d887-4175-8879-b0613e71062d" containerID="70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162" exitCode=0 Apr 16 22:34:30.557119 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:30.557100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" event={"ID":"e079000b-d887-4175-8879-b0613e71062d","Type":"ContainerDied","Data":"70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162"} Apr 16 22:34:31.563342 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:31.563302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" event={"ID":"e079000b-d887-4175-8879-b0613e71062d","Type":"ContainerStarted","Data":"6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a"} Apr 16 22:34:31.563737 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:31.563361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" event={"ID":"e079000b-d887-4175-8879-b0613e71062d","Type":"ContainerStarted","Data":"ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156"} Apr 16 22:34:31.563737 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:31.563562 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:31.582012 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:31.581967 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" podStartSLOduration=6.581951297 podStartE2EDuration="6.581951297s" podCreationTimestamp="2026-04-16 22:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:34:31.580162243 +0000 UTC m=+1250.704786530" watchObservedRunningTime="2026-04-16 22:34:31.581951297 +0000 UTC m=+1250.706575584" Apr 16 22:34:32.567193 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:32.567156 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:34:38.574890 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:34:38.574864 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:35:08.578772 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:08.578738 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" Apr 16 22:35:15.614532 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.614490 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn"] Apr 16 22:35:15.615061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.614836 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="kserve-container" containerID="cri-o://ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156" gracePeriod=30 Apr 16 22:35:15.615061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.614878 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="kube-rbac-proxy" containerID="cri-o://6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a" gracePeriod=30 Apr 16 22:35:15.699427 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:35:15.699395 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"] Apr 16 22:35:15.699777 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.699759 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="storage-initializer" Apr 16 22:35:15.699859 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.699780 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="storage-initializer" Apr 16 22:35:15.699859 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.699812 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kserve-container" Apr 16 22:35:15.699859 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.699821 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kserve-container" Apr 16 22:35:15.699859 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.699833 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kube-rbac-proxy" Apr 16 22:35:15.699859 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.699842 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kube-rbac-proxy" Apr 16 22:35:15.700109 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.699925 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kserve-container" Apr 16 22:35:15.700109 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.699941 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="edfa2a5e-112c-4fb6-95ae-9235b217a5ac" containerName="kube-rbac-proxy" Apr 16 22:35:15.703339 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.703304 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.705473 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.705449 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\""
Apr 16 22:35:15.705591 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.705471 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\""
Apr 16 22:35:15.714897 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.714874 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"]
Apr 16 22:35:15.791531 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.791494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f34ef695-31ba-4e6c-9b6c-f1825e691637-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.791734 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.791541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f34ef695-31ba-4e6c-9b6c-f1825e691637-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.791734 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.791615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvtf6\" (UniqueName: \"kubernetes.io/projected/f34ef695-31ba-4e6c-9b6c-f1825e691637-kube-api-access-dvtf6\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.791734 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.791656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.892994 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.892876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f34ef695-31ba-4e6c-9b6c-f1825e691637-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.893186 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.893037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f34ef695-31ba-4e6c-9b6c-f1825e691637-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.893186 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.893119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvtf6\" (UniqueName: \"kubernetes.io/projected/f34ef695-31ba-4e6c-9b6c-f1825e691637-kube-api-access-dvtf6\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.893186 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.893159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.893386 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:35:15.893279 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-serving-cert: secret "isvc-sklearn-mcp-predictor-serving-cert" not found
Apr 16 22:35:15.893386 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:35:15.893375 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls podName:f34ef695-31ba-4e6c-9b6c-f1825e691637 nodeName:}" failed. No retries permitted until 2026-04-16 22:35:16.393351992 +0000 UTC m=+1295.517976260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls") pod "isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" (UID: "f34ef695-31ba-4e6c-9b6c-f1825e691637") : secret "isvc-sklearn-mcp-predictor-serving-cert" not found
Apr 16 22:35:15.893516 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.893490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f34ef695-31ba-4e6c-9b6c-f1825e691637-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.893793 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.893771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f34ef695-31ba-4e6c-9b6c-f1825e691637-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:15.901931 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:15.901907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvtf6\" (UniqueName: \"kubernetes.io/projected/f34ef695-31ba-4e6c-9b6c-f1825e691637-kube-api-access-dvtf6\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:16.397856 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.397814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:16.400333 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.400293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:16.617043 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.617004 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:16.718470 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.718428 2576 generic.go:358] "Generic (PLEG): container finished" podID="e079000b-d887-4175-8879-b0613e71062d" containerID="6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a" exitCode=2
Apr 16 22:35:16.718639 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.718497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" event={"ID":"e079000b-d887-4175-8879-b0613e71062d","Type":"ContainerDied","Data":"6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a"}
Apr 16 22:35:16.750190 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.750076 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"]
Apr 16 22:35:16.753024 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:35:16.752990 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34ef695_31ba_4e6c_9b6c_f1825e691637.slice/crio-0c2ac49f38500487167914e3ef7f60b137df2599ec80e5facacfe3e4ef89f6cc WatchSource:0}: Error finding container 0c2ac49f38500487167914e3ef7f60b137df2599ec80e5facacfe3e4ef89f6cc: Status 404 returned error can't find the container with id 0c2ac49f38500487167914e3ef7f60b137df2599ec80e5facacfe3e4ef89f6cc
Apr 16 22:35:16.852874 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.852856 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn"
Apr 16 22:35:16.902466 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.902436 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5gj4\" (UniqueName: \"kubernetes.io/projected/e079000b-d887-4175-8879-b0613e71062d-kube-api-access-b5gj4\") pod \"e079000b-d887-4175-8879-b0613e71062d\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") "
Apr 16 22:35:16.902466 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.902477 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e079000b-d887-4175-8879-b0613e71062d-kserve-provision-location\") pod \"e079000b-d887-4175-8879-b0613e71062d\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") "
Apr 16 22:35:16.902719 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.902556 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls\") pod \"e079000b-d887-4175-8879-b0613e71062d\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") "
Apr 16 22:35:16.902719 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.902598 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e079000b-d887-4175-8879-b0613e71062d-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"e079000b-d887-4175-8879-b0613e71062d\" (UID: \"e079000b-d887-4175-8879-b0613e71062d\") "
Apr 16 22:35:16.903111 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.903054 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e079000b-d887-4175-8879-b0613e71062d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e079000b-d887-4175-8879-b0613e71062d" (UID: "e079000b-d887-4175-8879-b0613e71062d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:35:16.903111 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.903071 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e079000b-d887-4175-8879-b0613e71062d-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "e079000b-d887-4175-8879-b0613e71062d" (UID: "e079000b-d887-4175-8879-b0613e71062d"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:35:16.905144 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.905117 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e079000b-d887-4175-8879-b0613e71062d" (UID: "e079000b-d887-4175-8879-b0613e71062d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:35:16.905217 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:16.905150 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e079000b-d887-4175-8879-b0613e71062d-kube-api-access-b5gj4" (OuterVolumeSpecName: "kube-api-access-b5gj4") pod "e079000b-d887-4175-8879-b0613e71062d" (UID: "e079000b-d887-4175-8879-b0613e71062d"). InnerVolumeSpecName "kube-api-access-b5gj4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:35:17.003836 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.003799 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b5gj4\" (UniqueName: \"kubernetes.io/projected/e079000b-d887-4175-8879-b0613e71062d-kube-api-access-b5gj4\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:35:17.003981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.003839 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e079000b-d887-4175-8879-b0613e71062d-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:35:17.003981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.003854 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e079000b-d887-4175-8879-b0613e71062d-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:35:17.003981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.003871 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e079000b-d887-4175-8879-b0613e71062d-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:35:17.723361 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.723248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerStarted","Data":"38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b"}
Apr 16 22:35:17.723361 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.723289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerStarted","Data":"0c2ac49f38500487167914e3ef7f60b137df2599ec80e5facacfe3e4ef89f6cc"}
Apr 16 22:35:17.724793 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.724768 2576 generic.go:358] "Generic (PLEG): container finished" podID="e079000b-d887-4175-8879-b0613e71062d" containerID="ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156" exitCode=0
Apr 16 22:35:17.724867 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.724818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" event={"ID":"e079000b-d887-4175-8879-b0613e71062d","Type":"ContainerDied","Data":"ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156"}
Apr 16 22:35:17.724867 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.724843 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn"
Apr 16 22:35:17.724867 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.724859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn" event={"ID":"e079000b-d887-4175-8879-b0613e71062d","Type":"ContainerDied","Data":"9315c0669e741c884350eca024045f3f18261e4438bdc7a2863dfcd5ad6befa1"}
Apr 16 22:35:17.724970 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.724879 2576 scope.go:117] "RemoveContainer" containerID="6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a"
Apr 16 22:35:17.734764 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.734737 2576 scope.go:117] "RemoveContainer" containerID="ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156"
Apr 16 22:35:17.742050 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.742029 2576 scope.go:117] "RemoveContainer" containerID="70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162"
Apr 16 22:35:17.750526 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.750505 2576 scope.go:117] "RemoveContainer" containerID="6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a"
Apr 16 22:35:17.750912 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:35:17.750808 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a\": container with ID starting with 6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a not found: ID does not exist" containerID="6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a"
Apr 16 22:35:17.750912 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.750848 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a"} err="failed to get container status \"6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a\": rpc error: code = NotFound desc = could not find container \"6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a\": container with ID starting with 6f5c69c110d0efc9ace870d2dd52e09db7bdddb12f39244c9650e49ae73d392a not found: ID does not exist"
Apr 16 22:35:17.750912 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.750864 2576 scope.go:117] "RemoveContainer" containerID="ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156"
Apr 16 22:35:17.751150 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:35:17.751125 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156\": container with ID starting with ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156 not found: ID does not exist" containerID="ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156"
Apr 16 22:35:17.751209 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.751157 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156"} err="failed to get container status \"ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156\": rpc error: code = NotFound desc = could not find container \"ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156\": container with ID starting with ecd47f58a1adfdb8052500f2357a8c1e7bab6ac1bca7373ec62dcfb79aead156 not found: ID does not exist"
Apr 16 22:35:17.751209 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.751179 2576 scope.go:117] "RemoveContainer" containerID="70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162"
Apr 16 22:35:17.751454 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:35:17.751434 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162\": container with ID starting with 70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162 not found: ID does not exist" containerID="70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162"
Apr 16 22:35:17.751521 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.751460 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162"} err="failed to get container status \"70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162\": rpc error: code = NotFound desc = could not find container \"70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162\": container with ID starting with 70d44ceab1d202c4e8db215d6da519531698f0e10c56b65da898485ea61ba162 not found: ID does not exist"
Apr 16 22:35:17.755523 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.755497 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn"]
Apr 16 22:35:17.756783 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:17.756761 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-7l7dn"]
Apr 16 22:35:19.453434 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:19.453402 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e079000b-d887-4175-8879-b0613e71062d" path="/var/lib/kubelet/pods/e079000b-d887-4175-8879-b0613e71062d/volumes"
Apr 16 22:35:20.737663 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:20.737628 2576 generic.go:358] "Generic (PLEG): container finished" podID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerID="38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b" exitCode=0
Apr 16 22:35:20.738024 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:20.737692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerDied","Data":"38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b"}
Apr 16 22:35:21.743569 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:21.743535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerStarted","Data":"27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2"}
Apr 16 22:35:23.754507 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:23.754473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerStarted","Data":"e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837"}
Apr 16 22:35:23.754507 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:23.754508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerStarted","Data":"f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade"}
Apr 16 22:35:23.754959 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:23.754604 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:23.754959 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:23.754643 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:23.754959 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:23.754771 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:23.775893 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:23.775849 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podStartSLOduration=6.531401976 podStartE2EDuration="8.775834557s" podCreationTimestamp="2026-04-16 22:35:15 +0000 UTC" firstStartedPulling="2026-04-16 22:35:20.799062642 +0000 UTC m=+1299.923686907" lastFinishedPulling="2026-04-16 22:35:23.043495217 +0000 UTC m=+1302.168119488" observedRunningTime="2026-04-16 22:35:23.774227619 +0000 UTC m=+1302.898851905" watchObservedRunningTime="2026-04-16 22:35:23.775834557 +0000 UTC m=+1302.900458844"
Apr 16 22:35:29.763416 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:29.763385 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:35:59.765883 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:35:59.765795 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:36:29.766611 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:29.766577 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"
Apr 16 22:36:35.761564 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.761526 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"]
Apr 16 22:36:35.762180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.761984 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-container" containerID="cri-o://27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2" gracePeriod=30
Apr 16 22:36:35.762180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.762058 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" containerID="cri-o://e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837" gracePeriod=30
Apr 16 22:36:35.762180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.762126 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-agent" containerID="cri-o://f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade" gracePeriod=30
Apr 16 22:36:35.837630 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.837593 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"]
Apr 16 22:36:35.837941 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.837927 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="storage-initializer"
Apr 16 22:36:35.837990 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.837943 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="storage-initializer"
Apr 16 22:36:35.837990 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.837953 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="kube-rbac-proxy"
Apr 16 22:36:35.837990 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.837959 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="kube-rbac-proxy"
Apr 16 22:36:35.837990 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.837979 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="kserve-container"
Apr 16 22:36:35.837990 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.837985 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="kserve-container"
Apr 16 22:36:35.838150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.838029 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="kube-rbac-proxy"
Apr 16 22:36:35.838150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.838037 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e079000b-d887-4175-8879-b0613e71062d" containerName="kserve-container"
Apr 16 22:36:35.841405 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.841382 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:35.843800 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.843771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\""
Apr 16 22:36:35.843918 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.843771 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\""
Apr 16 22:36:35.849580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.849555 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"]
Apr 16 22:36:35.934635 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.934596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7215289-752a-494b-baf5-dfb0f543b937-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:35.934786 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.934641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7215289-752a-494b-baf5-dfb0f543b937-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:35.934786 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.934748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndhtc\" (UniqueName: \"kubernetes.io/projected/c7215289-752a-494b-baf5-dfb0f543b937-kube-api-access-ndhtc\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:35.934861 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.934789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7215289-752a-494b-baf5-dfb0f543b937-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:35.989661 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.989630 2576 generic.go:358] "Generic (PLEG): container finished" podID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerID="e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837" exitCode=2
Apr 16 22:36:35.989836 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:35.989702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerDied","Data":"e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837"}
Apr 16 22:36:36.036120 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.036029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7215289-752a-494b-baf5-dfb0f543b937-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.036120 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.036068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7215289-752a-494b-baf5-dfb0f543b937-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.036120 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.036112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndhtc\" (UniqueName: \"kubernetes.io/projected/c7215289-752a-494b-baf5-dfb0f543b937-kube-api-access-ndhtc\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.036424 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.036233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7215289-752a-494b-baf5-dfb0f543b937-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.036651 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.036631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7215289-752a-494b-baf5-dfb0f543b937-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.036918 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.036893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7215289-752a-494b-baf5-dfb0f543b937-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.038763 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.038738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7215289-752a-494b-baf5-dfb0f543b937-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.045996 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.045962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndhtc\" (UniqueName: \"kubernetes.io/projected/c7215289-752a-494b-baf5-dfb0f543b937-kube-api-access-ndhtc\") pod \"isvc-paddle-predictor-6b8b7cfb4b-6hx29\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.152919 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.152884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:36:36.277691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.277632 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"]
Apr 16 22:36:36.279970 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:36:36.279937 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7215289_752a_494b_baf5_dfb0f543b937.slice/crio-7074827d10a6c7a7b33331733bf8e7454b38bf7e433933a287fc7e7fa78ed3bc WatchSource:0}: Error finding container 7074827d10a6c7a7b33331733bf8e7454b38bf7e433933a287fc7e7fa78ed3bc: Status 404 returned error can't find the container with id 7074827d10a6c7a7b33331733bf8e7454b38bf7e433933a287fc7e7fa78ed3bc
Apr 16 22:36:36.281844 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.281826 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:36:36.994128 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.994087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" event={"ID":"c7215289-752a-494b-baf5-dfb0f543b937","Type":"ContainerStarted","Data":"ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6"}
Apr 16 22:36:36.994128 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:36.994130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" event={"ID":"c7215289-752a-494b-baf5-dfb0f543b937","Type":"ContainerStarted","Data":"7074827d10a6c7a7b33331733bf8e7454b38bf7e433933a287fc7e7fa78ed3bc"}
Apr 16 22:36:39.003698 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:39.003664 2576 generic.go:358] "Generic (PLEG): container finished" podID="f34ef695-31ba-4e6c-9b6c-f1825e691637"
containerID="27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2" exitCode=0 Apr 16 22:36:39.004094 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:39.003735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerDied","Data":"27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2"} Apr 16 22:36:39.758693 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:39.758645 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 16 22:36:39.764628 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:39.764592 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.37:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 22:36:43.019023 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:43.018989 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7215289-752a-494b-baf5-dfb0f543b937" containerID="ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6" exitCode=0 Apr 16 22:36:43.019487 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:43.019052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" event={"ID":"c7215289-752a-494b-baf5-dfb0f543b937","Type":"ContainerDied","Data":"ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6"} Apr 16 22:36:44.759179 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:44.759114 
2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 16 22:36:49.758314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:49.758254 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 16 22:36:49.758767 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:49.758430 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" Apr 16 22:36:49.764132 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:49.764083 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.37:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 22:36:54.758611 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:54.758558 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 16 22:36:55.061717 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:55.061622 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" event={"ID":"c7215289-752a-494b-baf5-dfb0f543b937","Type":"ContainerStarted","Data":"c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9"} Apr 16 22:36:55.061717 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:55.061670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" event={"ID":"c7215289-752a-494b-baf5-dfb0f543b937","Type":"ContainerStarted","Data":"d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea"} Apr 16 22:36:55.061906 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:55.061883 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" Apr 16 22:36:55.080569 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:55.080518 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podStartSLOduration=8.670648471 podStartE2EDuration="20.080500637s" podCreationTimestamp="2026-04-16 22:36:35 +0000 UTC" firstStartedPulling="2026-04-16 22:36:43.020263016 +0000 UTC m=+1382.144887282" lastFinishedPulling="2026-04-16 22:36:54.430115168 +0000 UTC m=+1393.554739448" observedRunningTime="2026-04-16 22:36:55.079410213 +0000 UTC m=+1394.204034501" watchObservedRunningTime="2026-04-16 22:36:55.080500637 +0000 UTC m=+1394.205124924" Apr 16 22:36:56.065380 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:56.065351 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" Apr 16 22:36:56.066666 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:56.066643 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.133.0.38:8080: connect: connection refused" Apr 16 22:36:57.068635 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:57.068591 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 22:36:59.758694 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:59.758648 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 16 22:36:59.764148 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:59.764109 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.37:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 22:36:59.764282 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:36:59.764246 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" Apr 16 22:37:02.073492 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:02.073459 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" Apr 16 22:37:02.074005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:02.073979 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 22:37:04.758813 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:04.758771 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 16 22:37:05.955917 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:05.955893 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" Apr 16 22:37:06.097426 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.097313 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f34ef695-31ba-4e6c-9b6c-f1825e691637-kserve-provision-location\") pod \"f34ef695-31ba-4e6c-9b6c-f1825e691637\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " Apr 16 22:37:06.097598 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.097435 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls\") pod \"f34ef695-31ba-4e6c-9b6c-f1825e691637\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " Apr 16 22:37:06.097598 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.097522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f34ef695-31ba-4e6c-9b6c-f1825e691637-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"f34ef695-31ba-4e6c-9b6c-f1825e691637\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " Apr 16 22:37:06.097598 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:37:06.097568 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvtf6\" (UniqueName: \"kubernetes.io/projected/f34ef695-31ba-4e6c-9b6c-f1825e691637-kube-api-access-dvtf6\") pod \"f34ef695-31ba-4e6c-9b6c-f1825e691637\" (UID: \"f34ef695-31ba-4e6c-9b6c-f1825e691637\") " Apr 16 22:37:06.097784 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.097680 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f34ef695-31ba-4e6c-9b6c-f1825e691637-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f34ef695-31ba-4e6c-9b6c-f1825e691637" (UID: "f34ef695-31ba-4e6c-9b6c-f1825e691637"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:37:06.097899 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.097876 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f34ef695-31ba-4e6c-9b6c-f1825e691637-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:37:06.098041 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.097952 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34ef695-31ba-4e6c-9b6c-f1825e691637-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "f34ef695-31ba-4e6c-9b6c-f1825e691637" (UID: "f34ef695-31ba-4e6c-9b6c-f1825e691637"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:37:06.099146 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.099117 2576 generic.go:358] "Generic (PLEG): container finished" podID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerID="f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade" exitCode=137 Apr 16 22:37:06.099259 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.099203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerDied","Data":"f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade"} Apr 16 22:37:06.099259 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.099251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" event={"ID":"f34ef695-31ba-4e6c-9b6c-f1825e691637","Type":"ContainerDied","Data":"0c2ac49f38500487167914e3ef7f60b137df2599ec80e5facacfe3e4ef89f6cc"} Apr 16 22:37:06.099393 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.099272 2576 scope.go:117] "RemoveContainer" containerID="e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837" Apr 16 22:37:06.099393 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.099279 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn" Apr 16 22:37:06.100098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.100070 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f34ef695-31ba-4e6c-9b6c-f1825e691637" (UID: "f34ef695-31ba-4e6c-9b6c-f1825e691637"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:37:06.100194 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.100076 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34ef695-31ba-4e6c-9b6c-f1825e691637-kube-api-access-dvtf6" (OuterVolumeSpecName: "kube-api-access-dvtf6") pod "f34ef695-31ba-4e6c-9b6c-f1825e691637" (UID: "f34ef695-31ba-4e6c-9b6c-f1825e691637"). InnerVolumeSpecName "kube-api-access-dvtf6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:37:06.116531 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.116512 2576 scope.go:117] "RemoveContainer" containerID="f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade" Apr 16 22:37:06.124371 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.124351 2576 scope.go:117] "RemoveContainer" containerID="27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2" Apr 16 22:37:06.131790 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.131773 2576 scope.go:117] "RemoveContainer" containerID="38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b" Apr 16 22:37:06.139860 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.139837 2576 scope.go:117] "RemoveContainer" containerID="e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837" Apr 16 22:37:06.140124 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:37:06.140103 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837\": container with ID starting with e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837 not found: ID does not exist" containerID="e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837" Apr 16 22:37:06.140191 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.140132 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837"} err="failed to get container status \"e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837\": rpc error: code = NotFound desc = could not find container \"e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837\": container with ID starting with e1be6157dda8b48d0cf09fcd26f9d01876b73037811b615afd331c973f7e2837 not found: ID does not exist" Apr 16 22:37:06.140191 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.140150 2576 scope.go:117] "RemoveContainer" containerID="f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade" Apr 16 22:37:06.140424 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:37:06.140402 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade\": container with ID starting with f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade not found: ID does not exist" containerID="f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade" Apr 16 22:37:06.140485 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.140433 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade"} err="failed to get container status \"f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade\": rpc error: code = NotFound desc = could not find container \"f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade\": container with ID starting with f776b88c67653d4fd778b268818ecb4003ea78d9a16d765e23bd74c0b0efcade not found: ID does not exist" Apr 16 22:37:06.140485 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.140457 2576 scope.go:117] "RemoveContainer" containerID="27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2" Apr 16 22:37:06.140709 ip-10-0-133-183 
kubenswrapper[2576]: E0416 22:37:06.140692 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2\": container with ID starting with 27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2 not found: ID does not exist" containerID="27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2" Apr 16 22:37:06.140757 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.140714 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2"} err="failed to get container status \"27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2\": rpc error: code = NotFound desc = could not find container \"27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2\": container with ID starting with 27e1325967183597da2ccda147a2fcc209e9ff3de7ba8f7a76621b59d1ab17e2 not found: ID does not exist" Apr 16 22:37:06.140757 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.140728 2576 scope.go:117] "RemoveContainer" containerID="38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b" Apr 16 22:37:06.140975 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:37:06.140954 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b\": container with ID starting with 38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b not found: ID does not exist" containerID="38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b" Apr 16 22:37:06.141054 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.140979 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b"} 
err="failed to get container status \"38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b\": rpc error: code = NotFound desc = could not find container \"38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b\": container with ID starting with 38270106a8406f4241bc95eb7ac865dac906fdc3ba1001edc8ee55ac444bd65b not found: ID does not exist" Apr 16 22:37:06.199109 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.199076 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f34ef695-31ba-4e6c-9b6c-f1825e691637-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:37:06.199109 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.199105 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f34ef695-31ba-4e6c-9b6c-f1825e691637-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:37:06.199109 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.199117 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvtf6\" (UniqueName: \"kubernetes.io/projected/f34ef695-31ba-4e6c-9b6c-f1825e691637-kube-api-access-dvtf6\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:37:06.421606 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.421568 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"] Apr 16 22:37:06.425363 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:06.425317 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-t2fgn"] Apr 16 22:37:07.453973 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:07.453938 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" 
path="/var/lib/kubelet/pods/f34ef695-31ba-4e6c-9b6c-f1825e691637/volumes" Apr 16 22:37:12.074730 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:12.074691 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 22:37:22.074858 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:22.074815 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 22:37:32.074447 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:32.074361 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 22:37:42.074647 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:42.074617 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" Apr 16 22:37:47.283073 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.283040 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"] Apr 16 22:37:47.283505 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.283382 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container" containerID="cri-o://d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea" 
gracePeriod=30 Apr 16 22:37:47.283505 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.283436 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kube-rbac-proxy" containerID="cri-o://c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9" gracePeriod=30 Apr 16 22:37:47.381603 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381563 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"] Apr 16 22:37:47.381873 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381860 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-container" Apr 16 22:37:47.381873 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381874 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-container" Apr 16 22:37:47.381978 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381884 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="storage-initializer" Apr 16 22:37:47.381978 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381889 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="storage-initializer" Apr 16 22:37:47.381978 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381898 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-agent" Apr 16 22:37:47.381978 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381906 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-agent" Apr 16 22:37:47.381978 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381917 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" Apr 16 22:37:47.381978 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381922 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" Apr 16 22:37:47.381978 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381974 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-container" Apr 16 22:37:47.381978 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381980 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kube-rbac-proxy" Apr 16 22:37:47.382219 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.381989 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f34ef695-31ba-4e6c-9b6c-f1825e691637" containerName="kserve-agent" Apr 16 22:37:47.386254 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.386238 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.388553 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.388526 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\""
Apr 16 22:37:47.388683 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.388641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\""
Apr 16 22:37:47.395412 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.395390 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"]
Apr 16 22:37:47.408150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.408130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.408252 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.408166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.408252 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.408189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.408252 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.408249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggk7\" (UniqueName: \"kubernetes.io/projected/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kube-api-access-8ggk7\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.509574 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.509530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.509751 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.509597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggk7\" (UniqueName: \"kubernetes.io/projected/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kube-api-access-8ggk7\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.509751 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.509661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.509751 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.509704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.509996 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:37:47.509811 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-runtime-predictor-serving-cert: secret "isvc-paddle-runtime-predictor-serving-cert" not found
Apr 16 22:37:47.509996 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:37:47.509882 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls podName:230f6f3c-ab48-4ea2-82c4-d3fe9cb73456 nodeName:}" failed. No retries permitted until 2026-04-16 22:37:48.00986138 +0000 UTC m=+1447.134485664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls") pod "isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" (UID: "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456") : secret "isvc-paddle-runtime-predictor-serving-cert" not found
Apr 16 22:37:47.510107 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.510058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.510315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.510295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:47.518091 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:47.518063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggk7\" (UniqueName: \"kubernetes.io/projected/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kube-api-access-8ggk7\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:48.013479 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:48.013440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:48.015940 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:48.015915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:48.231023 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:48.230987 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7215289-752a-494b-baf5-dfb0f543b937" containerID="c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9" exitCode=2
Apr 16 22:37:48.231178 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:48.231032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" event={"ID":"c7215289-752a-494b-baf5-dfb0f543b937","Type":"ContainerDied","Data":"c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9"}
Apr 16 22:37:48.297257 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:48.297170 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:48.426638 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:48.426605 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"]
Apr 16 22:37:48.429996 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:37:48.429969 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod230f6f3c_ab48_4ea2_82c4_d3fe9cb73456.slice/crio-4a5963e7b8a1dcd4fdf9d7ea7337a8a4bdd84ba7e1fd1d299a49cdbe0e70ab5d WatchSource:0}: Error finding container 4a5963e7b8a1dcd4fdf9d7ea7337a8a4bdd84ba7e1fd1d299a49cdbe0e70ab5d: Status 404 returned error can't find the container with id 4a5963e7b8a1dcd4fdf9d7ea7337a8a4bdd84ba7e1fd1d299a49cdbe0e70ab5d
Apr 16 22:37:49.235359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:49.235306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" event={"ID":"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456","Type":"ContainerStarted","Data":"1e2bc471b249d2ed5ef9ad90c2893ba9920e4ee189e25023c25ba4f558a0d80f"}
Apr 16 22:37:49.235359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:49.235360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" event={"ID":"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456","Type":"ContainerStarted","Data":"4a5963e7b8a1dcd4fdf9d7ea7337a8a4bdd84ba7e1fd1d299a49cdbe0e70ab5d"}
Apr 16 22:37:50.026382 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.026359 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:37:50.130112 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.130024 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7215289-752a-494b-baf5-dfb0f543b937-proxy-tls\") pod \"c7215289-752a-494b-baf5-dfb0f543b937\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") "
Apr 16 22:37:50.130112 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.130081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndhtc\" (UniqueName: \"kubernetes.io/projected/c7215289-752a-494b-baf5-dfb0f543b937-kube-api-access-ndhtc\") pod \"c7215289-752a-494b-baf5-dfb0f543b937\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") "
Apr 16 22:37:50.130351 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.130128 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7215289-752a-494b-baf5-dfb0f543b937-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"c7215289-752a-494b-baf5-dfb0f543b937\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") "
Apr 16 22:37:50.130351 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.130159 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7215289-752a-494b-baf5-dfb0f543b937-kserve-provision-location\") pod \"c7215289-752a-494b-baf5-dfb0f543b937\" (UID: \"c7215289-752a-494b-baf5-dfb0f543b937\") "
Apr 16 22:37:50.130557 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.130531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7215289-752a-494b-baf5-dfb0f543b937-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "c7215289-752a-494b-baf5-dfb0f543b937" (UID: "c7215289-752a-494b-baf5-dfb0f543b937"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:37:50.132236 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.132215 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7215289-752a-494b-baf5-dfb0f543b937-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c7215289-752a-494b-baf5-dfb0f543b937" (UID: "c7215289-752a-494b-baf5-dfb0f543b937"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:37:50.132359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.132268 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7215289-752a-494b-baf5-dfb0f543b937-kube-api-access-ndhtc" (OuterVolumeSpecName: "kube-api-access-ndhtc") pod "c7215289-752a-494b-baf5-dfb0f543b937" (UID: "c7215289-752a-494b-baf5-dfb0f543b937"). InnerVolumeSpecName "kube-api-access-ndhtc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:37:50.139743 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.139714 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7215289-752a-494b-baf5-dfb0f543b937-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7215289-752a-494b-baf5-dfb0f543b937" (UID: "c7215289-752a-494b-baf5-dfb0f543b937"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:50.231222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.231172 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndhtc\" (UniqueName: \"kubernetes.io/projected/c7215289-752a-494b-baf5-dfb0f543b937-kube-api-access-ndhtc\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:37:50.231222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.231216 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7215289-752a-494b-baf5-dfb0f543b937-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:37:50.231222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.231227 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7215289-752a-494b-baf5-dfb0f543b937-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:37:50.231222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.231237 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7215289-752a-494b-baf5-dfb0f543b937-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:37:50.241106 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.241079 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7215289-752a-494b-baf5-dfb0f543b937" containerID="d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea" exitCode=0
Apr 16 22:37:50.241266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.241163 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"
Apr 16 22:37:50.241266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.241173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" event={"ID":"c7215289-752a-494b-baf5-dfb0f543b937","Type":"ContainerDied","Data":"d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea"}
Apr 16 22:37:50.241266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.241217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29" event={"ID":"c7215289-752a-494b-baf5-dfb0f543b937","Type":"ContainerDied","Data":"7074827d10a6c7a7b33331733bf8e7454b38bf7e433933a287fc7e7fa78ed3bc"}
Apr 16 22:37:50.241266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.241238 2576 scope.go:117] "RemoveContainer" containerID="c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9"
Apr 16 22:37:50.249716 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.249701 2576 scope.go:117] "RemoveContainer" containerID="d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea"
Apr 16 22:37:50.256586 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.256568 2576 scope.go:117] "RemoveContainer" containerID="ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6"
Apr 16 22:37:50.261464 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.261444 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"]
Apr 16 22:37:50.264056 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.264034 2576 scope.go:117] "RemoveContainer" containerID="c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9"
Apr 16 22:37:50.264427 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:37:50.264304 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9\": container with ID starting with c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9 not found: ID does not exist" containerID="c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9"
Apr 16 22:37:50.264508 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.264447 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9"} err="failed to get container status \"c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9\": rpc error: code = NotFound desc = could not find container \"c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9\": container with ID starting with c8cccfaaa4fddaf4a41735714cec9554572c3595b478a5627b23c845160f7eb9 not found: ID does not exist"
Apr 16 22:37:50.264508 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.264473 2576 scope.go:117] "RemoveContainer" containerID="d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea"
Apr 16 22:37:50.264751 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:37:50.264733 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea\": container with ID starting with d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea not found: ID does not exist" containerID="d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea"
Apr 16 22:37:50.264806 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.264756 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea"} err="failed to get container status \"d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea\": rpc error: code = NotFound desc = could not find container \"d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea\": container with ID starting with d5dc560d73a755e3f22e4ec0f2fd2aff5c4e67dc252d8986ae91d45e0a9a9cea not found: ID does not exist"
Apr 16 22:37:50.264806 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.264771 2576 scope.go:117] "RemoveContainer" containerID="ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6"
Apr 16 22:37:50.264890 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.264844 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-6hx29"]
Apr 16 22:37:50.264997 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:37:50.264980 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6\": container with ID starting with ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6 not found: ID does not exist" containerID="ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6"
Apr 16 22:37:50.265038 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:50.265013 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6"} err="failed to get container status \"ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6\": rpc error: code = NotFound desc = could not find container \"ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6\": container with ID starting with ce01f465f261c507c2f6e03afacb59451c4319eaed2f95acb36fc9b2ce8f8ea6 not found: ID does not exist"
Apr 16 22:37:51.453117 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:51.453076 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7215289-752a-494b-baf5-dfb0f543b937" path="/var/lib/kubelet/pods/c7215289-752a-494b-baf5-dfb0f543b937/volumes"
Apr 16 22:37:53.252004 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:53.251969 2576 generic.go:358] "Generic (PLEG): container finished" podID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerID="1e2bc471b249d2ed5ef9ad90c2893ba9920e4ee189e25023c25ba4f558a0d80f" exitCode=0
Apr 16 22:37:53.252404 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:53.252047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" event={"ID":"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456","Type":"ContainerDied","Data":"1e2bc471b249d2ed5ef9ad90c2893ba9920e4ee189e25023c25ba4f558a0d80f"}
Apr 16 22:37:54.257175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:54.257142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" event={"ID":"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456","Type":"ContainerStarted","Data":"9293228997e634704e78e78c37ca68acdc42b991449f5783741c36c87ade5bd3"}
Apr 16 22:37:54.257175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:54.257177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" event={"ID":"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456","Type":"ContainerStarted","Data":"f088872a98a33303bb3ff9e519448ae20d3b2137c7e9f7be69f4653e3a5c7cec"}
Apr 16 22:37:54.257621 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:54.257439 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:54.257621 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:54.257536 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:37:54.258690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:54.258671 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 16 22:37:54.274509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:54.274461 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podStartSLOduration=7.274447074 podStartE2EDuration="7.274447074s" podCreationTimestamp="2026-04-16 22:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:37:54.273144163 +0000 UTC m=+1453.397768666" watchObservedRunningTime="2026-04-16 22:37:54.274447074 +0000 UTC m=+1453.399071361"
Apr 16 22:37:55.261079 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:37:55.261036 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 16 22:38:00.271624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:00.271591 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:38:00.272129 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:00.272101 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 16 22:38:10.272780 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:10.272735 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 16 22:38:20.272530 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:20.272491 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 16 22:38:30.272485 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:30.272442 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 16 22:38:40.273192 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:40.273154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"
Apr 16 22:38:48.774748 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.774711 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"]
Apr 16 22:38:48.775210 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.775152 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" containerID="cri-o://f088872a98a33303bb3ff9e519448ae20d3b2137c7e9f7be69f4653e3a5c7cec" gracePeriod=30
Apr 16 22:38:48.775294 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.775194 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kube-rbac-proxy" containerID="cri-o://9293228997e634704e78e78c37ca68acdc42b991449f5783741c36c87ade5bd3" gracePeriod=30
Apr 16 22:38:48.870116 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870080 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"]
Apr 16 22:38:48.870409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870397 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="storage-initializer"
Apr 16 22:38:48.870459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870410 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="storage-initializer"
Apr 16 22:38:48.870459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870429 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kube-rbac-proxy"
Apr 16 22:38:48.870459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870435 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kube-rbac-proxy"
Apr 16 22:38:48.870459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870442 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container"
Apr 16 22:38:48.870459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870448 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container"
Apr 16 22:38:48.870652 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870492 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kube-rbac-proxy"
Apr 16 22:38:48.870652 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.870504 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7215289-752a-494b-baf5-dfb0f543b937" containerName="kserve-container"
Apr 16 22:38:48.874793 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.874773 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:48.877303 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.877282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\""
Apr 16 22:38:48.877429 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.877282 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 16 22:38:48.881748 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.881726 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"]
Apr 16 22:38:48.899287 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.899261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34d4ba3-ea64-4929-a88b-6139247469ad-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:48.899419 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.899302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c34d4ba3-ea64-4929-a88b-6139247469ad-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:48.899419 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.899335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c34d4ba3-ea64-4929-a88b-6139247469ad-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:48.899512 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:48.899436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj7dc\" (UniqueName: \"kubernetes.io/projected/c34d4ba3-ea64-4929-a88b-6139247469ad-kube-api-access-nj7dc\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.000532 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.000498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34d4ba3-ea64-4929-a88b-6139247469ad-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.000696 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.000543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c34d4ba3-ea64-4929-a88b-6139247469ad-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.000752 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.000705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c34d4ba3-ea64-4929-a88b-6139247469ad-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.000752 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.000742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj7dc\" (UniqueName: \"kubernetes.io/projected/c34d4ba3-ea64-4929-a88b-6139247469ad-kube-api-access-nj7dc\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.000920 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.000900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34d4ba3-ea64-4929-a88b-6139247469ad-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.001179 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.001158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c34d4ba3-ea64-4929-a88b-6139247469ad-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.003105 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.003083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c34d4ba3-ea64-4929-a88b-6139247469ad-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.008126 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.008106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj7dc\" (UniqueName: \"kubernetes.io/projected/c34d4ba3-ea64-4929-a88b-6139247469ad-kube-api-access-nj7dc\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.187150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.187072 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"
Apr 16 22:38:49.308625 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.308526 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"]
Apr 16 22:38:49.310665 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:38:49.310636 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc34d4ba3_ea64_4929_a88b_6139247469ad.slice/crio-b6607c99092e7e85ba478adfac0b4fa5c427f64a10639be65627cc917772570a WatchSource:0}: Error finding container b6607c99092e7e85ba478adfac0b4fa5c427f64a10639be65627cc917772570a: Status 404 returned error can't find the container with id b6607c99092e7e85ba478adfac0b4fa5c427f64a10639be65627cc917772570a
Apr 16 22:38:49.433090 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.433053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" event={"ID":"c34d4ba3-ea64-4929-a88b-6139247469ad","Type":"ContainerStarted","Data":"a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba"}
Apr 16 22:38:49.433090 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.433090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" event={"ID":"c34d4ba3-ea64-4929-a88b-6139247469ad","Type":"ContainerStarted","Data":"b6607c99092e7e85ba478adfac0b4fa5c427f64a10639be65627cc917772570a"}
Apr 16 22:38:49.435017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.434992 2576 generic.go:358] "Generic (PLEG): container finished" podID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerID="9293228997e634704e78e78c37ca68acdc42b991449f5783741c36c87ade5bd3" exitCode=2
Apr 16 22:38:49.435141 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:49.435065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" event={"ID":"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456","Type":"ContainerDied","Data":"9293228997e634704e78e78c37ca68acdc42b991449f5783741c36c87ade5bd3"}
Apr 16 22:38:50.262231 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:50.262186 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused"
Apr 16 22:38:50.272979 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:50.272946 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 16 22:38:51.445128 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.445097 2576 generic.go:358] "Generic (PLEG): container finished" podID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerID="f088872a98a33303bb3ff9e519448ae20d3b2137c7e9f7be69f4653e3a5c7cec" exitCode=0
Apr 16 22:38:51.445588 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.445152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" event={"ID":"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456","Type":"ContainerDied","Data":"f088872a98a33303bb3ff9e519448ae20d3b2137c7e9f7be69f4653e3a5c7cec"}
Apr 16 22:38:51.525386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.525362 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" Apr 16 22:38:51.620828 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.620729 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " Apr 16 22:38:51.620828 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.620780 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kserve-provision-location\") pod \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " Apr 16 22:38:51.621056 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.620865 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls\") pod \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " Apr 16 22:38:51.621056 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.620890 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ggk7\" (UniqueName: \"kubernetes.io/projected/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kube-api-access-8ggk7\") pod \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\" (UID: \"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456\") " Apr 16 22:38:51.621137 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.621082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" (UID: "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:38:51.623023 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.623000 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" (UID: "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:38:51.623152 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.623131 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kube-api-access-8ggk7" (OuterVolumeSpecName: "kube-api-access-8ggk7") pod "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" (UID: "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456"). InnerVolumeSpecName "kube-api-access-8ggk7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:38:51.629524 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.629498 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" (UID: "230f6f3c-ab48-4ea2-82c4-d3fe9cb73456"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:38:51.722359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.722307 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:38:51.722359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.722351 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:38:51.722359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.722361 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:38:51.722359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:51.722371 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ggk7\" (UniqueName: \"kubernetes.io/projected/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456-kube-api-access-8ggk7\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:38:52.452125 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:52.452080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" event={"ID":"230f6f3c-ab48-4ea2-82c4-d3fe9cb73456","Type":"ContainerDied","Data":"4a5963e7b8a1dcd4fdf9d7ea7337a8a4bdd84ba7e1fd1d299a49cdbe0e70ab5d"} Apr 16 22:38:52.452125 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:52.452121 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv" Apr 16 22:38:52.452615 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:52.452138 2576 scope.go:117] "RemoveContainer" containerID="9293228997e634704e78e78c37ca68acdc42b991449f5783741c36c87ade5bd3" Apr 16 22:38:52.460764 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:52.460742 2576 scope.go:117] "RemoveContainer" containerID="f088872a98a33303bb3ff9e519448ae20d3b2137c7e9f7be69f4653e3a5c7cec" Apr 16 22:38:52.468404 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:52.468384 2576 scope.go:117] "RemoveContainer" containerID="1e2bc471b249d2ed5ef9ad90c2893ba9920e4ee189e25023c25ba4f558a0d80f" Apr 16 22:38:52.474065 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:52.474040 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"] Apr 16 22:38:52.477833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:52.477812 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-m76wv"] Apr 16 22:38:53.454123 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:53.454089 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" path="/var/lib/kubelet/pods/230f6f3c-ab48-4ea2-82c4-d3fe9cb73456/volumes" Apr 16 22:38:54.460261 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:54.460230 2576 generic.go:358] "Generic (PLEG): container finished" podID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerID="a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba" exitCode=0 Apr 16 22:38:54.460684 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:54.460301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" 
event={"ID":"c34d4ba3-ea64-4929-a88b-6139247469ad","Type":"ContainerDied","Data":"a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba"} Apr 16 22:38:55.465036 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:55.465001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" event={"ID":"c34d4ba3-ea64-4929-a88b-6139247469ad","Type":"ContainerStarted","Data":"04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c"} Apr 16 22:38:55.465422 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:55.465045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" event={"ID":"c34d4ba3-ea64-4929-a88b-6139247469ad","Type":"ContainerStarted","Data":"33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae"} Apr 16 22:38:55.465422 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:55.465239 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" Apr 16 22:38:55.485687 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:55.485634 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podStartSLOduration=7.485622418 podStartE2EDuration="7.485622418s" podCreationTimestamp="2026-04-16 22:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:38:55.484383311 +0000 UTC m=+1514.609007601" watchObservedRunningTime="2026-04-16 22:38:55.485622418 +0000 UTC m=+1514.610246702" Apr 16 22:38:56.468114 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:56.468081 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" Apr 16 22:38:56.469449 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:38:56.469420 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 22:38:57.474010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:38:57.473970 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 22:39:02.478619 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:02.478587 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" Apr 16 22:39:02.481002 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:02.479185 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 22:39:12.480151 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:12.480102 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 22:39:22.479510 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:22.479469 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 22:39:32.479830 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:32.479788 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 22:39:42.480164 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:42.480136 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" Apr 16 22:39:50.567642 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.567600 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"] Apr 16 22:39:50.568271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.568212 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" containerID="cri-o://33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae" gracePeriod=30 Apr 16 22:39:50.568455 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.568289 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kube-rbac-proxy" containerID="cri-o://04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c" gracePeriod=30 Apr 16 22:39:50.676175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.676138 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg"] Apr 16 22:39:50.676539 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:39:50.676523 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" Apr 16 22:39:50.676539 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.676539 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" Apr 16 22:39:50.676691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.676552 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="storage-initializer" Apr 16 22:39:50.676691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.676558 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="storage-initializer" Apr 16 22:39:50.676691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.676564 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kube-rbac-proxy" Apr 16 22:39:50.676691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.676572 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kube-rbac-proxy" Apr 16 22:39:50.676691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.676638 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kube-rbac-proxy" Apr 16 22:39:50.676691 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.676650 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="230f6f3c-ab48-4ea2-82c4-d3fe9cb73456" containerName="kserve-container" Apr 16 22:39:50.680966 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.680943 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.683410 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.683385 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 16 22:39:50.683771 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.683755 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 16 22:39:50.691705 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.691680 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg"] Apr 16 22:39:50.694714 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.694689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnrgg\" (UniqueName: \"kubernetes.io/projected/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kube-api-access-nnrgg\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.694811 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.694746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2c955-b726-47e4-9bdf-e1f15bfc9341-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.694852 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.694813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.694889 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.694853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.795549 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.795516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2c955-b726-47e4-9bdf-e1f15bfc9341-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.795549 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.795562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.795782 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.795586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.795782 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.795698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnrgg\" (UniqueName: \"kubernetes.io/projected/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kube-api-access-nnrgg\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.795782 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:39:50.795709 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-predictor-serving-cert: secret "isvc-pmml-predictor-serving-cert" not found Apr 16 22:39:50.795895 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:39:50.795786 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls podName:cba2c955-b726-47e4-9bdf-e1f15bfc9341 nodeName:}" failed. No retries permitted until 2026-04-16 22:39:51.295765294 +0000 UTC m=+1570.420389579 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls") pod "isvc-pmml-predictor-8bb578669-p57jg" (UID: "cba2c955-b726-47e4-9bdf-e1f15bfc9341") : secret "isvc-pmml-predictor-serving-cert" not found Apr 16 22:39:50.795993 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.795971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.796222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.796203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2c955-b726-47e4-9bdf-e1f15bfc9341-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:50.805498 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:50.805472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnrgg\" (UniqueName: \"kubernetes.io/projected/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kube-api-access-nnrgg\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:51.300638 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:51.300592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:51.303101 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:51.303069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-p57jg\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:51.592264 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:51.592168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:39:51.648062 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:51.648021 2576 generic.go:358] "Generic (PLEG): container finished" podID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerID="04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c" exitCode=2 Apr 16 22:39:51.648212 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:51.648094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" event={"ID":"c34d4ba3-ea64-4929-a88b-6139247469ad","Type":"ContainerDied","Data":"04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c"} Apr 16 22:39:51.714056 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:51.714023 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg"] Apr 16 22:39:51.716782 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:39:51.716752 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba2c955_b726_47e4_9bdf_e1f15bfc9341.slice/crio-46625a5b57a9a15526bd729b13a73b39f1090d6d11f0c4fdf634549054ef0576 WatchSource:0}: Error finding container 46625a5b57a9a15526bd729b13a73b39f1090d6d11f0c4fdf634549054ef0576: Status 404 returned error can't find the 
container with id 46625a5b57a9a15526bd729b13a73b39f1090d6d11f0c4fdf634549054ef0576 Apr 16 22:39:52.474663 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:52.474623 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 16 22:39:52.479114 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:52.479088 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 22:39:52.655578 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:52.655539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" event={"ID":"cba2c955-b726-47e4-9bdf-e1f15bfc9341","Type":"ContainerStarted","Data":"2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b"} Apr 16 22:39:52.655578 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:52.655583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" event={"ID":"cba2c955-b726-47e4-9bdf-e1f15bfc9341","Type":"ContainerStarted","Data":"46625a5b57a9a15526bd729b13a73b39f1090d6d11f0c4fdf634549054ef0576"} Apr 16 22:39:53.414282 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.414259 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" Apr 16 22:39:53.520726 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.520634 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c34d4ba3-ea64-4929-a88b-6139247469ad-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"c34d4ba3-ea64-4929-a88b-6139247469ad\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " Apr 16 22:39:53.520726 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.520680 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c34d4ba3-ea64-4929-a88b-6139247469ad-proxy-tls\") pod \"c34d4ba3-ea64-4929-a88b-6139247469ad\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " Apr 16 22:39:53.520726 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.520716 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj7dc\" (UniqueName: \"kubernetes.io/projected/c34d4ba3-ea64-4929-a88b-6139247469ad-kube-api-access-nj7dc\") pod \"c34d4ba3-ea64-4929-a88b-6139247469ad\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " Apr 16 22:39:53.520968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.520764 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34d4ba3-ea64-4929-a88b-6139247469ad-kserve-provision-location\") pod \"c34d4ba3-ea64-4929-a88b-6139247469ad\" (UID: \"c34d4ba3-ea64-4929-a88b-6139247469ad\") " Apr 16 22:39:53.521121 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.521084 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c34d4ba3-ea64-4929-a88b-6139247469ad-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "c34d4ba3-ea64-4929-a88b-6139247469ad" (UID: "c34d4ba3-ea64-4929-a88b-6139247469ad"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:39:53.522890 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.522869 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34d4ba3-ea64-4929-a88b-6139247469ad-kube-api-access-nj7dc" (OuterVolumeSpecName: "kube-api-access-nj7dc") pod "c34d4ba3-ea64-4929-a88b-6139247469ad" (UID: "c34d4ba3-ea64-4929-a88b-6139247469ad"). InnerVolumeSpecName "kube-api-access-nj7dc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:39:53.522976 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.522915 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c34d4ba3-ea64-4929-a88b-6139247469ad-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c34d4ba3-ea64-4929-a88b-6139247469ad" (UID: "c34d4ba3-ea64-4929-a88b-6139247469ad"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:39:53.530904 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.530871 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34d4ba3-ea64-4929-a88b-6139247469ad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c34d4ba3-ea64-4929-a88b-6139247469ad" (UID: "c34d4ba3-ea64-4929-a88b-6139247469ad"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:53.621492 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.621452 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nj7dc\" (UniqueName: \"kubernetes.io/projected/c34d4ba3-ea64-4929-a88b-6139247469ad-kube-api-access-nj7dc\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:39:53.621492 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.621485 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34d4ba3-ea64-4929-a88b-6139247469ad-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:39:53.621492 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.621495 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c34d4ba3-ea64-4929-a88b-6139247469ad-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:39:53.621798 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.621506 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c34d4ba3-ea64-4929-a88b-6139247469ad-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:39:53.661386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.661348 2576 generic.go:358] "Generic (PLEG): container finished" podID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerID="33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae" exitCode=0 Apr 16 22:39:53.661777 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.661426 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" Apr 16 22:39:53.661777 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.661429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" event={"ID":"c34d4ba3-ea64-4929-a88b-6139247469ad","Type":"ContainerDied","Data":"33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae"} Apr 16 22:39:53.661777 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.661473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm" event={"ID":"c34d4ba3-ea64-4929-a88b-6139247469ad","Type":"ContainerDied","Data":"b6607c99092e7e85ba478adfac0b4fa5c427f64a10639be65627cc917772570a"} Apr 16 22:39:53.661777 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.661486 2576 scope.go:117] "RemoveContainer" containerID="04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c" Apr 16 22:39:53.669490 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.669475 2576 scope.go:117] "RemoveContainer" containerID="33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae" Apr 16 22:39:53.676340 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.676303 2576 scope.go:117] "RemoveContainer" containerID="a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba" Apr 16 22:39:53.682112 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.682090 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"] Apr 16 22:39:53.684055 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.684031 2576 scope.go:117] "RemoveContainer" containerID="04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c" Apr 16 22:39:53.684307 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:39:53.684289 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c\": container with ID starting with 04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c not found: ID does not exist" containerID="04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c" Apr 16 22:39:53.684384 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.684318 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c"} err="failed to get container status \"04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c\": rpc error: code = NotFound desc = could not find container \"04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c\": container with ID starting with 04ee223f138f5a4553640f57785c6d6ee8f5ea5a62ce2ab9cc9107af2aa9737c not found: ID does not exist" Apr 16 22:39:53.684384 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.684352 2576 scope.go:117] "RemoveContainer" containerID="33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae" Apr 16 22:39:53.684577 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:39:53.684558 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae\": container with ID starting with 33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae not found: ID does not exist" containerID="33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae" Apr 16 22:39:53.684619 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.684582 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae"} err="failed to get container status \"33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae\": rpc error: code = NotFound desc = could not 
find container \"33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae\": container with ID starting with 33d62ba7c0b397e6fc7677971b81ccea2f31c5575efd5199846414c157907bae not found: ID does not exist" Apr 16 22:39:53.684619 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.684596 2576 scope.go:117] "RemoveContainer" containerID="a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba" Apr 16 22:39:53.684842 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:39:53.684824 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba\": container with ID starting with a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba not found: ID does not exist" containerID="a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba" Apr 16 22:39:53.684883 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.684848 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba"} err="failed to get container status \"a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba\": rpc error: code = NotFound desc = could not find container \"a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba\": container with ID starting with a07139314b6d543f70ce9a1920f2c91a21c244858d543ed0a6ef1376836738ba not found: ID does not exist" Apr 16 22:39:53.687960 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:53.687935 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-t4fqm"] Apr 16 22:39:55.454624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:55.454589 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" path="/var/lib/kubelet/pods/c34d4ba3-ea64-4929-a88b-6139247469ad/volumes" Apr 16 
22:39:55.670464 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:55.670430 2576 generic.go:358] "Generic (PLEG): container finished" podID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerID="2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b" exitCode=0 Apr 16 22:39:55.670654 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:39:55.670504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" event={"ID":"cba2c955-b726-47e4-9bdf-e1f15bfc9341","Type":"ContainerDied","Data":"2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b"} Apr 16 22:40:03.703092 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:03.703058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" event={"ID":"cba2c955-b726-47e4-9bdf-e1f15bfc9341","Type":"ContainerStarted","Data":"698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1"} Apr 16 22:40:03.703092 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:03.703097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" event={"ID":"cba2c955-b726-47e4-9bdf-e1f15bfc9341","Type":"ContainerStarted","Data":"cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07"} Apr 16 22:40:03.703575 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:03.703356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:40:03.703575 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:03.703388 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:40:03.704749 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:03.704723 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" 
podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:40:03.721728 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:03.721686 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podStartSLOduration=6.74857134 podStartE2EDuration="13.721674542s" podCreationTimestamp="2026-04-16 22:39:50 +0000 UTC" firstStartedPulling="2026-04-16 22:39:55.671767859 +0000 UTC m=+1574.796392123" lastFinishedPulling="2026-04-16 22:40:02.644871055 +0000 UTC m=+1581.769495325" observedRunningTime="2026-04-16 22:40:03.720494385 +0000 UTC m=+1582.845118672" watchObservedRunningTime="2026-04-16 22:40:03.721674542 +0000 UTC m=+1582.846298828" Apr 16 22:40:04.706848 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:04.706800 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:40:09.711699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:09.711668 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:40:09.712286 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:09.712256 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:40:19.713013 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:19.712976 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" 
podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:40:29.713085 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:29.712996 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:40:39.712508 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:39.712462 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:40:49.712640 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:49.712595 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:40:59.712867 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:40:59.712828 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:41:09.712656 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:09.712613 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 
22:41:19.712499 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:19.712456 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 22:41:29.713288 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:29.713259 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:41:32.108388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.108356 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg"] Apr 16 22:41:32.108777 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.108685 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container" containerID="cri-o://cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07" gracePeriod=30 Apr 16 22:41:32.108830 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.108734 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kube-rbac-proxy" containerID="cri-o://698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1" gracePeriod=30 Apr 16 22:41:32.219300 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219258 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"] Apr 16 22:41:32.219748 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219729 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" 
containerName="storage-initializer" Apr 16 22:41:32.219815 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219751 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="storage-initializer" Apr 16 22:41:32.219815 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219764 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kube-rbac-proxy" Apr 16 22:41:32.219815 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219774 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kube-rbac-proxy" Apr 16 22:41:32.219815 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219808 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" Apr 16 22:41:32.219941 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219818 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" Apr 16 22:41:32.219941 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219893 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kserve-container" Apr 16 22:41:32.219941 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.219911 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c34d4ba3-ea64-4929-a88b-6139247469ad" containerName="kube-rbac-proxy" Apr 16 22:41:32.223213 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.223194 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.225601 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.225580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 16 22:41:32.225700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.225627 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 16 22:41:32.233243 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.233221 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"] Apr 16 22:41:32.320805 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.320776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.320805 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.320806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqldw\" (UniqueName: \"kubernetes.io/projected/847fede3-3aef-4b60-a0d6-16d65f829b7f-kube-api-access-kqldw\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.320987 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.320852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/847fede3-3aef-4b60-a0d6-16d65f829b7f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.320987 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.320884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/847fede3-3aef-4b60-a0d6-16d65f829b7f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.421251 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.421170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.421251 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.421203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqldw\" (UniqueName: \"kubernetes.io/projected/847fede3-3aef-4b60-a0d6-16d65f829b7f-kube-api-access-kqldw\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.421513 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.421253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/847fede3-3aef-4b60-a0d6-16d65f829b7f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.421513 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.421305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/847fede3-3aef-4b60-a0d6-16d65f829b7f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.421513 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:41:32.421346 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-runtime-predictor-serving-cert: secret "isvc-pmml-runtime-predictor-serving-cert" not found Apr 16 22:41:32.421513 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:41:32.421416 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls podName:847fede3-3aef-4b60-a0d6-16d65f829b7f nodeName:}" failed. No retries permitted until 2026-04-16 22:41:32.921396075 +0000 UTC m=+1672.046020356 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls") pod "isvc-pmml-runtime-predictor-67bc544947-dkqsr" (UID: "847fede3-3aef-4b60-a0d6-16d65f829b7f") : secret "isvc-pmml-runtime-predictor-serving-cert" not found Apr 16 22:41:32.421748 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.421725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/847fede3-3aef-4b60-a0d6-16d65f829b7f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.421987 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.421965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/847fede3-3aef-4b60-a0d6-16d65f829b7f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.431766 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.431739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqldw\" (UniqueName: \"kubernetes.io/projected/847fede3-3aef-4b60-a0d6-16d65f829b7f-kube-api-access-kqldw\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.925391 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.925350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls\") pod 
\"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.927678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.927660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-dkqsr\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:32.985765 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.985730 2576 generic.go:358] "Generic (PLEG): container finished" podID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerID="698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1" exitCode=2 Apr 16 22:41:32.985765 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:32.985767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" event={"ID":"cba2c955-b726-47e4-9bdf-e1f15bfc9341","Type":"ContainerDied","Data":"698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1"} Apr 16 22:41:33.134674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:33.134632 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" Apr 16 22:41:33.258995 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:33.258967 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"] Apr 16 22:41:33.261065 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:41:33.261031 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod847fede3_3aef_4b60_a0d6_16d65f829b7f.slice/crio-e52f66307013ab601b4fe28c2f896d26b4046ba55fa9322ec1f72a59384e6552 WatchSource:0}: Error finding container e52f66307013ab601b4fe28c2f896d26b4046ba55fa9322ec1f72a59384e6552: Status 404 returned error can't find the container with id e52f66307013ab601b4fe28c2f896d26b4046ba55fa9322ec1f72a59384e6552 Apr 16 22:41:33.990775 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:33.990741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" event={"ID":"847fede3-3aef-4b60-a0d6-16d65f829b7f","Type":"ContainerStarted","Data":"55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6"} Apr 16 22:41:33.990775 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:33.990777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" event={"ID":"847fede3-3aef-4b60-a0d6-16d65f829b7f","Type":"ContainerStarted","Data":"e52f66307013ab601b4fe28c2f896d26b4046ba55fa9322ec1f72a59384e6552"} Apr 16 22:41:34.708038 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:34.707995 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.41:8643/healthz\": dial tcp 10.133.0.41:8643: connect: connection refused" Apr 16 
22:41:35.660041 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.660021 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" Apr 16 22:41:35.748459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.748376 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls\") pod \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " Apr 16 22:41:35.748459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.748438 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kserve-provision-location\") pod \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " Apr 16 22:41:35.748459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.748459 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2c955-b726-47e4-9bdf-e1f15bfc9341-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " Apr 16 22:41:35.748961 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.748508 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnrgg\" (UniqueName: \"kubernetes.io/projected/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kube-api-access-nnrgg\") pod \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\" (UID: \"cba2c955-b726-47e4-9bdf-e1f15bfc9341\") " Apr 16 22:41:35.748961 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.748756 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cba2c955-b726-47e4-9bdf-e1f15bfc9341" (UID: "cba2c955-b726-47e4-9bdf-e1f15bfc9341"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:41:35.748961 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.748775 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba2c955-b726-47e4-9bdf-e1f15bfc9341-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "cba2c955-b726-47e4-9bdf-e1f15bfc9341" (UID: "cba2c955-b726-47e4-9bdf-e1f15bfc9341"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:41:35.750431 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.750405 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kube-api-access-nnrgg" (OuterVolumeSpecName: "kube-api-access-nnrgg") pod "cba2c955-b726-47e4-9bdf-e1f15bfc9341" (UID: "cba2c955-b726-47e4-9bdf-e1f15bfc9341"). InnerVolumeSpecName "kube-api-access-nnrgg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:41:35.750536 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.750491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cba2c955-b726-47e4-9bdf-e1f15bfc9341" (UID: "cba2c955-b726-47e4-9bdf-e1f15bfc9341"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:41:35.849021 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.848992 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nnrgg\" (UniqueName: \"kubernetes.io/projected/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kube-api-access-nnrgg\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:41:35.849021 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.849017 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2c955-b726-47e4-9bdf-e1f15bfc9341-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:41:35.849197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.849028 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cba2c955-b726-47e4-9bdf-e1f15bfc9341-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:41:35.849197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:35.849038 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2c955-b726-47e4-9bdf-e1f15bfc9341-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:41:36.001413 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.001308 2576 generic.go:358] "Generic (PLEG): container finished" podID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerID="cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07" exitCode=0
Apr 16 22:41:36.001413 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.001400 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg"
Apr 16 22:41:36.001581 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.001394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" event={"ID":"cba2c955-b726-47e4-9bdf-e1f15bfc9341","Type":"ContainerDied","Data":"cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07"}
Apr 16 22:41:36.001581 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.001514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg" event={"ID":"cba2c955-b726-47e4-9bdf-e1f15bfc9341","Type":"ContainerDied","Data":"46625a5b57a9a15526bd729b13a73b39f1090d6d11f0c4fdf634549054ef0576"}
Apr 16 22:41:36.001581 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.001531 2576 scope.go:117] "RemoveContainer" containerID="698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1"
Apr 16 22:41:36.009484 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.009468 2576 scope.go:117] "RemoveContainer" containerID="cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07"
Apr 16 22:41:36.016490 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.016476 2576 scope.go:117] "RemoveContainer" containerID="2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b"
Apr 16 22:41:36.022163 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.022142 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg"]
Apr 16 22:41:36.023309 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.023283 2576 scope.go:117] "RemoveContainer" containerID="698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1"
Apr 16 22:41:36.023586 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:41:36.023568 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1\": container with ID starting with 698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1 not found: ID does not exist" containerID="698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1"
Apr 16 22:41:36.023651 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.023595 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1"} err="failed to get container status \"698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1\": rpc error: code = NotFound desc = could not find container \"698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1\": container with ID starting with 698b8be1eeec6386603e59b6ce968c56733d0b9157c0969cc974d6ac3bbaedf1 not found: ID does not exist"
Apr 16 22:41:36.023651 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.023613 2576 scope.go:117] "RemoveContainer" containerID="cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07"
Apr 16 22:41:36.023896 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:41:36.023878 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07\": container with ID starting with cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07 not found: ID does not exist" containerID="cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07"
Apr 16 22:41:36.024124 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.023910 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07"} err="failed to get container status \"cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07\": rpc error: code = NotFound desc = could not find container \"cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07\": container with ID starting with cf398909b1a223e20bcc39c2a092df430f41d886c9d824f1de5c3b490e870c07 not found: ID does not exist"
Apr 16 22:41:36.024124 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.023931 2576 scope.go:117] "RemoveContainer" containerID="2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b"
Apr 16 22:41:36.024426 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:41:36.024393 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b\": container with ID starting with 2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b not found: ID does not exist" containerID="2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b"
Apr 16 22:41:36.024537 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.024431 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b"} err="failed to get container status \"2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b\": rpc error: code = NotFound desc = could not find container \"2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b\": container with ID starting with 2023ed1135ea348b91189900f8f95a01790b06888842d1bd042a23c4bf50428b not found: ID does not exist"
Apr 16 22:41:36.026419 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:36.026397 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-p57jg"]
Apr 16 22:41:37.006167 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:37.006135 2576 generic.go:358] "Generic (PLEG): container finished" podID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerID="55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6" exitCode=0
Apr 16 22:41:37.006613 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:37.006211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" event={"ID":"847fede3-3aef-4b60-a0d6-16d65f829b7f","Type":"ContainerDied","Data":"55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6"}
Apr 16 22:41:37.007501 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:37.007483 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:41:37.453421 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:37.453380 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" path="/var/lib/kubelet/pods/cba2c955-b726-47e4-9bdf-e1f15bfc9341/volumes"
Apr 16 22:41:38.012270 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:38.012240 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" event={"ID":"847fede3-3aef-4b60-a0d6-16d65f829b7f","Type":"ContainerStarted","Data":"df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59"}
Apr 16 22:41:38.012645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:38.012279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" event={"ID":"847fede3-3aef-4b60-a0d6-16d65f829b7f","Type":"ContainerStarted","Data":"41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1"}
Apr 16 22:41:38.012645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:38.012508 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"
Apr 16 22:41:38.031383 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:38.031336 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podStartSLOduration=6.031303027 podStartE2EDuration="6.031303027s" podCreationTimestamp="2026-04-16 22:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:41:38.030154206 +0000 UTC m=+1677.154778493" watchObservedRunningTime="2026-04-16 22:41:38.031303027 +0000 UTC m=+1677.155927319"
Apr 16 22:41:39.015014 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:39.014979 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"
Apr 16 22:41:39.016259 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:39.016228 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:41:40.018896 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:40.018850 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:41:45.023445 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:45.023413 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"
Apr 16 22:41:45.024060 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:45.024029 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:41:55.024399 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:41:55.024359 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:42:05.024400 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:42:05.024292 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:42:15.024100 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:42:15.024057 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:42:25.024118 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:42:25.024076 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:42:35.024207 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:42:35.024169 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:42:45.024388 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:42:45.024344 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:42:45.449344 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:42:45.449284 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:42:55.452783 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:42:55.452751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"
Apr 16 22:43:03.326107 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.326071 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"]
Apr 16 22:43:03.326660 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.326461 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" containerID="cri-o://41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1" gracePeriod=30
Apr 16 22:43:03.326660 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.326604 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kube-rbac-proxy" containerID="cri-o://df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59" gracePeriod=30
Apr 16 22:43:03.436911 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.436876 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"]
Apr 16 22:43:03.437216 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.437192 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container"
Apr 16 22:43:03.437216 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.437207 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container"
Apr 16 22:43:03.437216 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.437218 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="storage-initializer"
Apr 16 22:43:03.437432 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.437224 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="storage-initializer"
Apr 16 22:43:03.437432 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.437243 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kube-rbac-proxy"
Apr 16 22:43:03.437432 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.437248 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kube-rbac-proxy"
Apr 16 22:43:03.437432 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.437293 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kube-rbac-proxy"
Apr 16 22:43:03.437432 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.437301 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cba2c955-b726-47e4-9bdf-e1f15bfc9341" containerName="kserve-container"
Apr 16 22:43:03.440079 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.440062 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.442631 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.442611 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\""
Apr 16 22:43:03.442716 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.442612 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 16 22:43:03.454544 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.454522 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"]
Apr 16 22:43:03.635504 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.635426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dda0514-d906-4dd9-8cb5-4520e36ca894-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.635659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.635521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dda0514-d906-4dd9-8cb5-4520e36ca894-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.635659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.635552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-296sx\" (UniqueName: \"kubernetes.io/projected/3dda0514-d906-4dd9-8cb5-4520e36ca894-kube-api-access-296sx\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.635800 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.635669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dda0514-d906-4dd9-8cb5-4520e36ca894-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.736422 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.736384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dda0514-d906-4dd9-8cb5-4520e36ca894-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.736618 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.736441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dda0514-d906-4dd9-8cb5-4520e36ca894-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.736618 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.736476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dda0514-d906-4dd9-8cb5-4520e36ca894-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.736618 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.736498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-296sx\" (UniqueName: \"kubernetes.io/projected/3dda0514-d906-4dd9-8cb5-4520e36ca894-kube-api-access-296sx\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.736912 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.736886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dda0514-d906-4dd9-8cb5-4520e36ca894-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.737129 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.737105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dda0514-d906-4dd9-8cb5-4520e36ca894-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.739052 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.739028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dda0514-d906-4dd9-8cb5-4520e36ca894-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.745054 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.745028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-296sx\" (UniqueName: \"kubernetes.io/projected/3dda0514-d906-4dd9-8cb5-4520e36ca894-kube-api-access-296sx\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.751774 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.751749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"
Apr 16 22:43:03.893089 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:03.893058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"]
Apr 16 22:43:03.894288 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:43:03.894264 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dda0514_d906_4dd9_8cb5_4520e36ca894.slice/crio-c27014c7d96bc1b72a312981caa9169a46f7381872bff80f055ded1b1cc93092 WatchSource:0}: Error finding container c27014c7d96bc1b72a312981caa9169a46f7381872bff80f055ded1b1cc93092: Status 404 returned error can't find the container with id c27014c7d96bc1b72a312981caa9169a46f7381872bff80f055ded1b1cc93092
Apr 16 22:43:04.290258 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:04.290218 2576 generic.go:358] "Generic (PLEG): container finished" podID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerID="df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59" exitCode=2
Apr 16 22:43:04.290447 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:04.290291 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" event={"ID":"847fede3-3aef-4b60-a0d6-16d65f829b7f","Type":"ContainerDied","Data":"df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59"}
Apr 16 22:43:04.291652 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:04.291629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" event={"ID":"3dda0514-d906-4dd9-8cb5-4520e36ca894","Type":"ContainerStarted","Data":"89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2"}
Apr 16 22:43:04.291773 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:04.291659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" event={"ID":"3dda0514-d906-4dd9-8cb5-4520e36ca894","Type":"ContainerStarted","Data":"c27014c7d96bc1b72a312981caa9169a46f7381872bff80f055ded1b1cc93092"}
Apr 16 22:43:05.019703 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:05.019656 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused"
Apr 16 22:43:05.449502 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:05.449459 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 16 22:43:06.967866 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:06.967842 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"
Apr 16 22:43:07.063529 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.063450 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls\") pod \"847fede3-3aef-4b60-a0d6-16d65f829b7f\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") "
Apr 16 22:43:07.063529 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.063493 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/847fede3-3aef-4b60-a0d6-16d65f829b7f-kserve-provision-location\") pod \"847fede3-3aef-4b60-a0d6-16d65f829b7f\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") "
Apr 16 22:43:07.063711 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.063540 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/847fede3-3aef-4b60-a0d6-16d65f829b7f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"847fede3-3aef-4b60-a0d6-16d65f829b7f\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") "
Apr 16 22:43:07.063711 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.063604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqldw\" (UniqueName: \"kubernetes.io/projected/847fede3-3aef-4b60-a0d6-16d65f829b7f-kube-api-access-kqldw\") pod \"847fede3-3aef-4b60-a0d6-16d65f829b7f\" (UID: \"847fede3-3aef-4b60-a0d6-16d65f829b7f\") "
Apr 16 22:43:07.063841 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.063816 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847fede3-3aef-4b60-a0d6-16d65f829b7f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "847fede3-3aef-4b60-a0d6-16d65f829b7f" (UID: "847fede3-3aef-4b60-a0d6-16d65f829b7f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:43:07.063942 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.063917 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/847fede3-3aef-4b60-a0d6-16d65f829b7f-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "847fede3-3aef-4b60-a0d6-16d65f829b7f" (UID: "847fede3-3aef-4b60-a0d6-16d65f829b7f"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:43:07.065670 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.065645 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/847fede3-3aef-4b60-a0d6-16d65f829b7f-kube-api-access-kqldw" (OuterVolumeSpecName: "kube-api-access-kqldw") pod "847fede3-3aef-4b60-a0d6-16d65f829b7f" (UID: "847fede3-3aef-4b60-a0d6-16d65f829b7f"). InnerVolumeSpecName "kube-api-access-kqldw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:43:07.065773 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.065725 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "847fede3-3aef-4b60-a0d6-16d65f829b7f" (UID: "847fede3-3aef-4b60-a0d6-16d65f829b7f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:43:07.164758 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.164728 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/847fede3-3aef-4b60-a0d6-16d65f829b7f-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:43:07.164758 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.164752 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/847fede3-3aef-4b60-a0d6-16d65f829b7f-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:43:07.164758 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.164762 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/847fede3-3aef-4b60-a0d6-16d65f829b7f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:43:07.164983 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.164772 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kqldw\" (UniqueName: \"kubernetes.io/projected/847fede3-3aef-4b60-a0d6-16d65f829b7f-kube-api-access-kqldw\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:43:07.304145 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.304114 2576 generic.go:358] "Generic (PLEG): container finished" podID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerID="41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1" exitCode=0
Apr 16 22:43:07.304298 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.304176 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" event={"ID":"847fede3-3aef-4b60-a0d6-16d65f829b7f","Type":"ContainerDied","Data":"41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1"}
Apr 16 22:43:07.304298 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.304206 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr" event={"ID":"847fede3-3aef-4b60-a0d6-16d65f829b7f","Type":"ContainerDied","Data":"e52f66307013ab601b4fe28c2f896d26b4046ba55fa9322ec1f72a59384e6552"}
Apr 16 22:43:07.304298 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.304203 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"
Apr 16 22:43:07.304298 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.304218 2576 scope.go:117] "RemoveContainer" containerID="df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59"
Apr 16 22:43:07.312110 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.312089 2576 scope.go:117] "RemoveContainer" containerID="41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1"
Apr 16 22:43:07.318836 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.318817 2576 scope.go:117] "RemoveContainer" containerID="55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6"
Apr 16 22:43:07.324695 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.324671 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"]
Apr 16 22:43:07.326077 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.326064 2576 scope.go:117] "RemoveContainer" containerID="df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59"
Apr 16 22:43:07.326403 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:43:07.326383 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59\": container with ID starting with df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59 not found: ID does not exist" containerID="df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59"
Apr 16 22:43:07.326459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.326414 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59"} err="failed to get container status \"df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59\": rpc error: code = NotFound desc = could not find container \"df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59\": container with ID starting with df7343811a5c184ef900cadf09c88a43fc7c87e9c3740927a0670e457f92fb59 not found: ID does not exist"
Apr 16 22:43:07.326459 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.326431 2576 scope.go:117] "RemoveContainer" containerID="41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1"
Apr 16 22:43:07.326678 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:43:07.326660 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1\": container with ID starting with 41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1 not found: ID does not exist" containerID="41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1"
Apr 16 22:43:07.326724 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.326685 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1"} err="failed to get container status \"41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1\": rpc error: code = NotFound desc = could not find container \"41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1\": container with ID starting with 41de16f2517340d2cfb909c76425fdda7bde668c09e88fd971f17c5184f4c8c1 not found: ID does not exist"
Apr 16 22:43:07.326724 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.326701 2576 scope.go:117] "RemoveContainer" containerID="55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6"
Apr 16 22:43:07.326916 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:43:07.326901 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6\": container with ID starting with 55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6 not found: ID does not exist" containerID="55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6"
Apr 16 22:43:07.326958 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.326920 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6"} err="failed to get container status \"55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6\": rpc error: code = NotFound desc = could not find container \"55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6\": container with ID starting with 55dad73d494a46acf7b32207daa0d9ac7e7fd1d93cadc5e789ff8ad1b1898eb6 not found: ID does not exist"
Apr 16 22:43:07.329736 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.329716 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dkqsr"]
Apr 16 22:43:07.453299 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:07.453262 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" path="/var/lib/kubelet/pods/847fede3-3aef-4b60-a0d6-16d65f829b7f/volumes"
Apr 16 22:43:08.308485
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:08.308436 2576 generic.go:358] "Generic (PLEG): container finished" podID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerID="89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2" exitCode=0 Apr 16 22:43:08.308923 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:08.308509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" event={"ID":"3dda0514-d906-4dd9-8cb5-4520e36ca894","Type":"ContainerDied","Data":"89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2"} Apr 16 22:43:09.315016 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:09.314978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" event={"ID":"3dda0514-d906-4dd9-8cb5-4520e36ca894","Type":"ContainerStarted","Data":"dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177"} Apr 16 22:43:09.315016 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:09.315021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" event={"ID":"3dda0514-d906-4dd9-8cb5-4520e36ca894","Type":"ContainerStarted","Data":"08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3"} Apr 16 22:43:09.315559 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:09.315245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" Apr 16 22:43:09.334522 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:09.334468 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podStartSLOduration=6.334448589 podStartE2EDuration="6.334448589s" podCreationTimestamp="2026-04-16 22:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:43:09.333737572 +0000 UTC m=+1768.458361860" watchObservedRunningTime="2026-04-16 22:43:09.334448589 +0000 UTC m=+1768.459072875" Apr 16 22:43:10.318465 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:10.318424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" Apr 16 22:43:10.319889 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:10.319859 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:43:11.321858 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:11.321821 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:43:16.325731 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:16.325699 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" Apr 16 22:43:16.326217 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:16.326143 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:43:26.326147 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:26.326108 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" 
podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:43:36.326120 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:36.326080 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:43:46.326313 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:46.326273 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:43:56.327068 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:43:56.327030 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:44:06.326993 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:06.326948 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:44:16.326502 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:16.326459 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.43:8080: connect: connection refused" Apr 16 22:44:26.327008 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:26.326974 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" Apr 16 22:44:34.395903 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.395862 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"] Apr 16 22:44:34.396430 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.396223 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" containerID="cri-o://08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3" gracePeriod=30 Apr 16 22:44:34.396430 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.396263 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kube-rbac-proxy" containerID="cri-o://dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177" gracePeriod=30 Apr 16 22:44:34.509080 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509054 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm"] Apr 16 22:44:34.509395 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509381 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="storage-initializer" Apr 16 22:44:34.509456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509396 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="storage-initializer" Apr 16 22:44:34.509456 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509410 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" Apr 16 22:44:34.509456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509416 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" Apr 16 22:44:34.509456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509424 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kube-rbac-proxy" Apr 16 22:44:34.509456 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509430 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kube-rbac-proxy" Apr 16 22:44:34.509641 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509480 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kserve-container" Apr 16 22:44:34.509641 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.509499 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="847fede3-3aef-4b60-a0d6-16d65f829b7f" containerName="kube-rbac-proxy" Apr 16 22:44:34.512414 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.512392 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.516466 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.516432 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-0d7654-kube-rbac-proxy-sar-config\"" Apr 16 22:44:34.516725 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.516708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-0d7654-predictor-serving-cert\"" Apr 16 22:44:34.522675 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.522652 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm"] Apr 16 22:44:34.563403 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.563372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5fa030ed-f732-420e-844c-db1e7c668faa-isvc-primary-0d7654-kube-rbac-proxy-sar-config\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.563547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.563416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fa030ed-f732-420e-844c-db1e7c668faa-kserve-provision-location\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.563547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.563457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zh5d8\" (UniqueName: \"kubernetes.io/projected/5fa030ed-f732-420e-844c-db1e7c668faa-kube-api-access-zh5d8\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.563547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.563508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5fa030ed-f732-420e-844c-db1e7c668faa-proxy-tls\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.591236 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.591204 2576 generic.go:358] "Generic (PLEG): container finished" podID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerID="dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177" exitCode=2 Apr 16 22:44:34.591397 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.591273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" event={"ID":"3dda0514-d906-4dd9-8cb5-4520e36ca894","Type":"ContainerDied","Data":"dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177"} Apr 16 22:44:34.664369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.664264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5fa030ed-f732-420e-844c-db1e7c668faa-isvc-primary-0d7654-kube-rbac-proxy-sar-config\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.664369 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.664310 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fa030ed-f732-420e-844c-db1e7c668faa-kserve-provision-location\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.664605 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.664382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5d8\" (UniqueName: \"kubernetes.io/projected/5fa030ed-f732-420e-844c-db1e7c668faa-kube-api-access-zh5d8\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.664605 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.664405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5fa030ed-f732-420e-844c-db1e7c668faa-proxy-tls\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.664793 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.664774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fa030ed-f732-420e-844c-db1e7c668faa-kserve-provision-location\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.664962 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.664928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5fa030ed-f732-420e-844c-db1e7c668faa-isvc-primary-0d7654-kube-rbac-proxy-sar-config\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.666793 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.666771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5fa030ed-f732-420e-844c-db1e7c668faa-proxy-tls\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.672629 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.672604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5d8\" (UniqueName: \"kubernetes.io/projected/5fa030ed-f732-420e-844c-db1e7c668faa-kube-api-access-zh5d8\") pod \"isvc-primary-0d7654-predictor-ff757f45b-xphvm\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.823586 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.823543 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:34.946385 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:34.946356 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm"] Apr 16 22:44:34.949812 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:44:34.949780 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fa030ed_f732_420e_844c_db1e7c668faa.slice/crio-c02f6b5502ca1945d165e7e37799dbe588af00b2275103f1a3d4cf3f662cde60 WatchSource:0}: Error finding container c02f6b5502ca1945d165e7e37799dbe588af00b2275103f1a3d4cf3f662cde60: Status 404 returned error can't find the container with id c02f6b5502ca1945d165e7e37799dbe588af00b2275103f1a3d4cf3f662cde60 Apr 16 22:44:35.595989 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:35.595954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" event={"ID":"5fa030ed-f732-420e-844c-db1e7c668faa","Type":"ContainerStarted","Data":"2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95"} Apr 16 22:44:35.595989 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:35.595992 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" event={"ID":"5fa030ed-f732-420e-844c-db1e7c668faa","Type":"ContainerStarted","Data":"c02f6b5502ca1945d165e7e37799dbe588af00b2275103f1a3d4cf3f662cde60"} Apr 16 22:44:36.322899 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:36.322854 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" 
Apr 16 22:44:36.326185 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:36.326155 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 22:44:38.038476 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.038453 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" Apr 16 22:44:38.095040 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.094977 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dda0514-d906-4dd9-8cb5-4520e36ca894-kserve-provision-location\") pod \"3dda0514-d906-4dd9-8cb5-4520e36ca894\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " Apr 16 22:44:38.095178 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.095062 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dda0514-d906-4dd9-8cb5-4520e36ca894-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"3dda0514-d906-4dd9-8cb5-4520e36ca894\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " Apr 16 22:44:38.095178 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.095124 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-296sx\" (UniqueName: \"kubernetes.io/projected/3dda0514-d906-4dd9-8cb5-4520e36ca894-kube-api-access-296sx\") pod \"3dda0514-d906-4dd9-8cb5-4520e36ca894\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " Apr 16 22:44:38.095178 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.095151 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dda0514-d906-4dd9-8cb5-4520e36ca894-proxy-tls\") pod \"3dda0514-d906-4dd9-8cb5-4520e36ca894\" (UID: \"3dda0514-d906-4dd9-8cb5-4520e36ca894\") " Apr 16 22:44:38.095371 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.095319 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dda0514-d906-4dd9-8cb5-4520e36ca894-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3dda0514-d906-4dd9-8cb5-4520e36ca894" (UID: "3dda0514-d906-4dd9-8cb5-4520e36ca894"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:44:38.095493 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.095469 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dda0514-d906-4dd9-8cb5-4520e36ca894-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "3dda0514-d906-4dd9-8cb5-4520e36ca894" (UID: "3dda0514-d906-4dd9-8cb5-4520e36ca894"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:44:38.097067 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.097040 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dda0514-d906-4dd9-8cb5-4520e36ca894-kube-api-access-296sx" (OuterVolumeSpecName: "kube-api-access-296sx") pod "3dda0514-d906-4dd9-8cb5-4520e36ca894" (UID: "3dda0514-d906-4dd9-8cb5-4520e36ca894"). InnerVolumeSpecName "kube-api-access-296sx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:44:38.097183 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.097157 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dda0514-d906-4dd9-8cb5-4520e36ca894-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3dda0514-d906-4dd9-8cb5-4520e36ca894" (UID: "3dda0514-d906-4dd9-8cb5-4520e36ca894"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:44:38.196699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.196669 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dda0514-d906-4dd9-8cb5-4520e36ca894-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:44:38.196699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.196696 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dda0514-d906-4dd9-8cb5-4520e36ca894-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:44:38.196854 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.196707 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-296sx\" (UniqueName: \"kubernetes.io/projected/3dda0514-d906-4dd9-8cb5-4520e36ca894-kube-api-access-296sx\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:44:38.196854 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.196718 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dda0514-d906-4dd9-8cb5-4520e36ca894-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:44:38.607726 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.607694 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerID="08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3" exitCode=0 Apr 16 22:44:38.607929 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.607767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" event={"ID":"3dda0514-d906-4dd9-8cb5-4520e36ca894","Type":"ContainerDied","Data":"08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3"} Apr 16 22:44:38.607929 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.607800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" event={"ID":"3dda0514-d906-4dd9-8cb5-4520e36ca894","Type":"ContainerDied","Data":"c27014c7d96bc1b72a312981caa9169a46f7381872bff80f055ded1b1cc93092"} Apr 16 22:44:38.607929 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.607819 2576 scope.go:117] "RemoveContainer" containerID="dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177" Apr 16 22:44:38.607929 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.607771 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s" Apr 16 22:44:38.616071 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.616048 2576 scope.go:117] "RemoveContainer" containerID="08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3" Apr 16 22:44:38.622955 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.622938 2576 scope.go:117] "RemoveContainer" containerID="89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2" Apr 16 22:44:38.629589 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.629572 2576 scope.go:117] "RemoveContainer" containerID="dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177" Apr 16 22:44:38.629834 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:44:38.629814 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177\": container with ID starting with dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177 not found: ID does not exist" containerID="dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177" Apr 16 22:44:38.629912 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.629849 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177"} err="failed to get container status \"dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177\": rpc error: code = NotFound desc = could not find container \"dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177\": container with ID starting with dcaa1992b51b3084cd6b7d2418cf0e27b1d8be8e7e7e713dca95539f8f01f177 not found: ID does not exist" Apr 16 22:44:38.629912 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.629877 2576 scope.go:117] "RemoveContainer" containerID="08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3" Apr 16 
22:44:38.630180 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:44:38.630160 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3\": container with ID starting with 08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3 not found: ID does not exist" containerID="08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3" Apr 16 22:44:38.630233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.630185 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3"} err="failed to get container status \"08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3\": rpc error: code = NotFound desc = could not find container \"08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3\": container with ID starting with 08fed80213cf504842cee76d083c220bab3b7ea891e53f189808b56b020e34c3 not found: ID does not exist" Apr 16 22:44:38.630233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.630202 2576 scope.go:117] "RemoveContainer" containerID="89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2" Apr 16 22:44:38.630474 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:44:38.630455 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2\": container with ID starting with 89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2 not found: ID does not exist" containerID="89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2" Apr 16 22:44:38.630542 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.630482 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2"} err="failed to get container status \"89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2\": rpc error: code = NotFound desc = could not find container \"89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2\": container with ID starting with 89b5f5269e3b430635e92fd9c0f2031a0c88719f1439bad4348df83056d4f7f2 not found: ID does not exist" Apr 16 22:44:38.631218 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.631195 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"] Apr 16 22:44:38.634879 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:38.634858 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vzp5s"] Apr 16 22:44:39.452864 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:39.452832 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" path="/var/lib/kubelet/pods/3dda0514-d906-4dd9-8cb5-4520e36ca894/volumes" Apr 16 22:44:39.614682 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:39.614650 2576 generic.go:358] "Generic (PLEG): container finished" podID="5fa030ed-f732-420e-844c-db1e7c668faa" containerID="2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95" exitCode=0 Apr 16 22:44:39.614869 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:39.614727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" event={"ID":"5fa030ed-f732-420e-844c-db1e7c668faa","Type":"ContainerDied","Data":"2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95"} Apr 16 22:44:40.620358 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:40.620306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" 
event={"ID":"5fa030ed-f732-420e-844c-db1e7c668faa","Type":"ContainerStarted","Data":"aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f"} Apr 16 22:44:40.620358 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:40.620363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" event={"ID":"5fa030ed-f732-420e-844c-db1e7c668faa","Type":"ContainerStarted","Data":"35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa"} Apr 16 22:44:40.620868 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:40.620567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:40.638131 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:40.638086 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podStartSLOduration=6.638068897 podStartE2EDuration="6.638068897s" podCreationTimestamp="2026-04-16 22:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:44:40.637034163 +0000 UTC m=+1859.761658461" watchObservedRunningTime="2026-04-16 22:44:40.638068897 +0000 UTC m=+1859.762693184" Apr 16 22:44:41.623550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:41.623515 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:41.624711 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:41.624680 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 22:44:42.627774 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:42.627734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 22:44:47.632006 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:47.631978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:44:47.632619 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:47.632588 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 22:44:57.633066 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:44:57.633028 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 22:45:07.632615 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:07.632529 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 22:45:17.632835 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:17.632793 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 22:45:27.633265 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:27.633223 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 22:45:37.633539 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:37.633498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 22:45:47.634089 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:47.634058 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:45:54.621703 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.621663 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r"] Apr 16 22:45:54.622263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.622116 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kube-rbac-proxy" Apr 16 22:45:54.622263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.622136 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kube-rbac-proxy" Apr 16 22:45:54.622263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.622151 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" Apr 16 22:45:54.622263 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:45:54.622160 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" Apr 16 22:45:54.622263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.622181 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="storage-initializer" Apr 16 22:45:54.622263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.622191 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="storage-initializer" Apr 16 22:45:54.622629 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.622276 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kube-rbac-proxy" Apr 16 22:45:54.622629 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.622288 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dda0514-d906-4dd9-8cb5-4520e36ca894" containerName="kserve-container" Apr 16 22:45:54.625550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.625530 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.627832 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.627807 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-0d7654-kube-rbac-proxy-sar-config\"" Apr 16 22:45:54.627953 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.627851 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-0d7654\"" Apr 16 22:45:54.627953 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.627851 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-0d7654-dockercfg-9rzpq\"" Apr 16 22:45:54.627953 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.627904 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-0d7654-predictor-serving-cert\"" Apr 16 22:45:54.628120 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.628079 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 22:45:54.634968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.634944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r"] Apr 16 22:45:54.711387 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.711357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-cabundle-cert\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.711558 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.711406 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a502f95-1664-41fd-9531-71b9492f8b69-proxy-tls\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.711558 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.711427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a502f95-1664-41fd-9531-71b9492f8b69-kserve-provision-location\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.711558 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.711459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-isvc-secondary-0d7654-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.711558 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.711534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xqtk\" (UniqueName: \"kubernetes.io/projected/4a502f95-1664-41fd-9531-71b9492f8b69-kube-api-access-2xqtk\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.812094 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.812062 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-cabundle-cert\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.812277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.812117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a502f95-1664-41fd-9531-71b9492f8b69-proxy-tls\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.812277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.812135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a502f95-1664-41fd-9531-71b9492f8b69-kserve-provision-location\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.812277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.812178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-isvc-secondary-0d7654-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.812277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.812204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2xqtk\" (UniqueName: \"kubernetes.io/projected/4a502f95-1664-41fd-9531-71b9492f8b69-kube-api-access-2xqtk\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.812540 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.812500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a502f95-1664-41fd-9531-71b9492f8b69-kserve-provision-location\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.812827 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.812804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-isvc-secondary-0d7654-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.812959 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.812938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-cabundle-cert\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.814672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.814653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a502f95-1664-41fd-9531-71b9492f8b69-proxy-tls\") 
pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.820411 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.820388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xqtk\" (UniqueName: \"kubernetes.io/projected/4a502f95-1664-41fd-9531-71b9492f8b69-kube-api-access-2xqtk\") pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:54.936145 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:54.936069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:45:55.064674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:55.064558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r"] Apr 16 22:45:55.067502 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:45:55.067461 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a502f95_1664_41fd_9531_71b9492f8b69.slice/crio-acb74489cc7b8f1b368807b26830e4a70ce6a69af4328ba069ea7dd7d1303e53 WatchSource:0}: Error finding container acb74489cc7b8f1b368807b26830e4a70ce6a69af4328ba069ea7dd7d1303e53: Status 404 returned error can't find the container with id acb74489cc7b8f1b368807b26830e4a70ce6a69af4328ba069ea7dd7d1303e53 Apr 16 22:45:55.868443 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:55.868398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" 
event={"ID":"4a502f95-1664-41fd-9531-71b9492f8b69","Type":"ContainerStarted","Data":"fe46c57c932ec4cda7c12cffe0322c1a71c44af76128435f4df70538356ad071"} Apr 16 22:45:55.868908 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:55.868448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" event={"ID":"4a502f95-1664-41fd-9531-71b9492f8b69","Type":"ContainerStarted","Data":"acb74489cc7b8f1b368807b26830e4a70ce6a69af4328ba069ea7dd7d1303e53"} Apr 16 22:45:59.883474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:59.883445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_4a502f95-1664-41fd-9531-71b9492f8b69/storage-initializer/0.log" Apr 16 22:45:59.883964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:59.883482 2576 generic.go:358] "Generic (PLEG): container finished" podID="4a502f95-1664-41fd-9531-71b9492f8b69" containerID="fe46c57c932ec4cda7c12cffe0322c1a71c44af76128435f4df70538356ad071" exitCode=1 Apr 16 22:45:59.883964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:45:59.883541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" event={"ID":"4a502f95-1664-41fd-9531-71b9492f8b69","Type":"ContainerDied","Data":"fe46c57c932ec4cda7c12cffe0322c1a71c44af76128435f4df70538356ad071"} Apr 16 22:46:00.888290 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:00.888262 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_4a502f95-1664-41fd-9531-71b9492f8b69/storage-initializer/0.log" Apr 16 22:46:00.888700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:00.888376 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" 
event={"ID":"4a502f95-1664-41fd-9531-71b9492f8b69","Type":"ContainerStarted","Data":"ec95d5bfeb6cb1a48aae980a3c041326c9e6c2cf4d57668cd2df9c5d35c3bfb9"} Apr 16 22:46:02.896170 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:02.896131 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_4a502f95-1664-41fd-9531-71b9492f8b69/storage-initializer/1.log" Apr 16 22:46:02.896590 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:02.896531 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_4a502f95-1664-41fd-9531-71b9492f8b69/storage-initializer/0.log" Apr 16 22:46:02.896590 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:02.896562 2576 generic.go:358] "Generic (PLEG): container finished" podID="4a502f95-1664-41fd-9531-71b9492f8b69" containerID="ec95d5bfeb6cb1a48aae980a3c041326c9e6c2cf4d57668cd2df9c5d35c3bfb9" exitCode=1 Apr 16 22:46:02.896699 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:02.896646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" event={"ID":"4a502f95-1664-41fd-9531-71b9492f8b69","Type":"ContainerDied","Data":"ec95d5bfeb6cb1a48aae980a3c041326c9e6c2cf4d57668cd2df9c5d35c3bfb9"} Apr 16 22:46:02.896754 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:02.896702 2576 scope.go:117] "RemoveContainer" containerID="fe46c57c932ec4cda7c12cffe0322c1a71c44af76128435f4df70538356ad071" Apr 16 22:46:02.897102 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:02.897072 2576 scope.go:117] "RemoveContainer" containerID="fe46c57c932ec4cda7c12cffe0322c1a71c44af76128435f4df70538356ad071" Apr 16 22:46:02.907231 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:02.907201 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_kserve-ci-e2e-test_4a502f95-1664-41fd-9531-71b9492f8b69_0 in pod sandbox acb74489cc7b8f1b368807b26830e4a70ce6a69af4328ba069ea7dd7d1303e53 from index: no such id: 'fe46c57c932ec4cda7c12cffe0322c1a71c44af76128435f4df70538356ad071'" containerID="fe46c57c932ec4cda7c12cffe0322c1a71c44af76128435f4df70538356ad071" Apr 16 22:46:02.907302 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:02.907257 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_kserve-ci-e2e-test_4a502f95-1664-41fd-9531-71b9492f8b69_0 in pod sandbox acb74489cc7b8f1b368807b26830e4a70ce6a69af4328ba069ea7dd7d1303e53 from index: no such id: 'fe46c57c932ec4cda7c12cffe0322c1a71c44af76128435f4df70538356ad071'; Skipping pod \"isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_kserve-ci-e2e-test(4a502f95-1664-41fd-9531-71b9492f8b69)\"" logger="UnhandledError" Apr 16 22:46:02.908669 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:02.908647 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_kserve-ci-e2e-test(4a502f95-1664-41fd-9531-71b9492f8b69)\"" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" podUID="4a502f95-1664-41fd-9531-71b9492f8b69" Apr 16 22:46:03.900713 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:03.900685 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_4a502f95-1664-41fd-9531-71b9492f8b69/storage-initializer/1.log" Apr 16 22:46:10.673214 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.673180 2576 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r"] Apr 16 22:46:10.721624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.721583 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm"] Apr 16 22:46:10.722471 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.721956 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" containerID="cri-o://35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa" gracePeriod=30 Apr 16 22:46:10.722471 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.722023 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kube-rbac-proxy" containerID="cri-o://aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f" gracePeriod=30 Apr 16 22:46:10.793946 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.793910 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg"] Apr 16 22:46:10.798890 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.798858 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.801166 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.801142 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-241fac-predictor-serving-cert\"" Apr 16 22:46:10.801314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.801218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-241fac-dockercfg-4sts5\"" Apr 16 22:46:10.801415 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.801397 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-241fac-kube-rbac-proxy-sar-config\"" Apr 16 22:46:10.801480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.801397 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-241fac\"" Apr 16 22:46:10.807817 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.807790 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg"] Apr 16 22:46:10.843363 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.843319 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_4a502f95-1664-41fd-9531-71b9492f8b69/storage-initializer/1.log" Apr 16 22:46:10.843489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.843405 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:46:10.845721 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.845703 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xqtk\" (UniqueName: \"kubernetes.io/projected/4a502f95-1664-41fd-9531-71b9492f8b69-kube-api-access-2xqtk\") pod \"4a502f95-1664-41fd-9531-71b9492f8b69\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " Apr 16 22:46:10.845772 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.845735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-isvc-secondary-0d7654-kube-rbac-proxy-sar-config\") pod \"4a502f95-1664-41fd-9531-71b9492f8b69\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " Apr 16 22:46:10.845772 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.845765 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-cabundle-cert\") pod \"4a502f95-1664-41fd-9531-71b9492f8b69\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " Apr 16 22:46:10.845875 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.845780 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a502f95-1664-41fd-9531-71b9492f8b69-kserve-provision-location\") pod \"4a502f95-1664-41fd-9531-71b9492f8b69\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " Apr 16 22:46:10.845875 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.845856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grv7n\" (UniqueName: 
\"kubernetes.io/projected/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kube-api-access-grv7n\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.845985 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.845897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-241fac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-isvc-init-fail-241fac-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.845985 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.845940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kserve-provision-location\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.846104 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.846053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-cabundle-cert\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.846104 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.846078 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4a502f95-1664-41fd-9531-71b9492f8b69-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a502f95-1664-41fd-9531-71b9492f8b69" (UID: "4a502f95-1664-41fd-9531-71b9492f8b69"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:46:10.846191 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.846105 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-isvc-secondary-0d7654-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-0d7654-kube-rbac-proxy-sar-config") pod "4a502f95-1664-41fd-9531-71b9492f8b69" (UID: "4a502f95-1664-41fd-9531-71b9492f8b69"). InnerVolumeSpecName "isvc-secondary-0d7654-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:46:10.846191 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.846123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.846191 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.846120 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4a502f95-1664-41fd-9531-71b9492f8b69" (UID: "4a502f95-1664-41fd-9531-71b9492f8b69"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:46:10.846191 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.846183 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-cabundle-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:10.846348 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.846196 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a502f95-1664-41fd-9531-71b9492f8b69-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:10.846348 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.846207 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a502f95-1664-41fd-9531-71b9492f8b69-isvc-secondary-0d7654-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:10.848118 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.848100 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a502f95-1664-41fd-9531-71b9492f8b69-kube-api-access-2xqtk" (OuterVolumeSpecName: "kube-api-access-2xqtk") pod "4a502f95-1664-41fd-9531-71b9492f8b69" (UID: "4a502f95-1664-41fd-9531-71b9492f8b69"). InnerVolumeSpecName "kube-api-access-2xqtk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:46:10.926639 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.926555 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0d7654-predictor-6647fcc567-bpj6r_4a502f95-1664-41fd-9531-71b9492f8b69/storage-initializer/1.log" Apr 16 22:46:10.926831 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.926662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" event={"ID":"4a502f95-1664-41fd-9531-71b9492f8b69","Type":"ContainerDied","Data":"acb74489cc7b8f1b368807b26830e4a70ce6a69af4328ba069ea7dd7d1303e53"} Apr 16 22:46:10.926831 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.926686 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r" Apr 16 22:46:10.926831 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.926704 2576 scope.go:117] "RemoveContainer" containerID="ec95d5bfeb6cb1a48aae980a3c041326c9e6c2cf4d57668cd2df9c5d35c3bfb9" Apr 16 22:46:10.928667 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.928638 2576 generic.go:358] "Generic (PLEG): container finished" podID="5fa030ed-f732-420e-844c-db1e7c668faa" containerID="aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f" exitCode=2 Apr 16 22:46:10.928770 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.928696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" event={"ID":"5fa030ed-f732-420e-844c-db1e7c668faa","Type":"ContainerDied","Data":"aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f"} Apr 16 22:46:10.946817 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.946794 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4a502f95-1664-41fd-9531-71b9492f8b69-proxy-tls\") pod \"4a502f95-1664-41fd-9531-71b9492f8b69\" (UID: \"4a502f95-1664-41fd-9531-71b9492f8b69\") " Apr 16 22:46:10.946896 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.946880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv7n\" (UniqueName: \"kubernetes.io/projected/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kube-api-access-grv7n\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.946947 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.946912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-241fac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-isvc-init-fail-241fac-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.946947 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.946937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kserve-provision-location\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.947051 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.946962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-cabundle-cert\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: 
\"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.947051 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.947003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.947156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.947061 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2xqtk\" (UniqueName: \"kubernetes.io/projected/4a502f95-1664-41fd-9531-71b9492f8b69-kube-api-access-2xqtk\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:10.947156 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:10.947148 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-serving-cert: secret "isvc-init-fail-241fac-predictor-serving-cert" not found Apr 16 22:46:10.947264 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:10.947205 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls podName:58a51b10-a9df-4d2e-a933-a40d4cd327f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:46:11.447187418 +0000 UTC m=+1950.571811683 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls") pod "isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" (UID: "58a51b10-a9df-4d2e-a933-a40d4cd327f1") : secret "isvc-init-fail-241fac-predictor-serving-cert" not found Apr 16 22:46:10.947450 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.947429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kserve-provision-location\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.947735 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.947716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-cabundle-cert\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.947782 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.947729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-241fac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-isvc-init-fail-241fac-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:10.949061 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.949040 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a502f95-1664-41fd-9531-71b9492f8b69-proxy-tls" (OuterVolumeSpecName: 
"proxy-tls") pod "4a502f95-1664-41fd-9531-71b9492f8b69" (UID: "4a502f95-1664-41fd-9531-71b9492f8b69"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:46:10.955032 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:10.955009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grv7n\" (UniqueName: \"kubernetes.io/projected/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kube-api-access-grv7n\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:11.048248 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.048216 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a502f95-1664-41fd-9531-71b9492f8b69-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:11.262876 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.262841 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r"] Apr 16 22:46:11.268829 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.268803 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0d7654-predictor-6647fcc567-bpj6r"] Apr 16 22:46:11.451685 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.451659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:11.453315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.453286 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4a502f95-1664-41fd-9531-71b9492f8b69" path="/var/lib/kubelet/pods/4a502f95-1664-41fd-9531-71b9492f8b69/volumes" Apr 16 22:46:11.454079 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.454060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls\") pod \"isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:11.718509 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.718479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:11.839268 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.839234 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg"] Apr 16 22:46:11.842385 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:46:11.842351 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58a51b10_a9df_4d2e_a933_a40d4cd327f1.slice/crio-691519c88647ba0bf84a19172fb2de523b8297b372e8bb2df1dd59978c720382 WatchSource:0}: Error finding container 691519c88647ba0bf84a19172fb2de523b8297b372e8bb2df1dd59978c720382: Status 404 returned error can't find the container with id 691519c88647ba0bf84a19172fb2de523b8297b372e8bb2df1dd59978c720382 Apr 16 22:46:11.933492 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:11.933458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" event={"ID":"58a51b10-a9df-4d2e-a933-a40d4cd327f1","Type":"ContainerStarted","Data":"4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02"} Apr 16 22:46:11.933671 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:46:11.933501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" event={"ID":"58a51b10-a9df-4d2e-a933-a40d4cd327f1","Type":"ContainerStarted","Data":"691519c88647ba0bf84a19172fb2de523b8297b372e8bb2df1dd59978c720382"} Apr 16 22:46:12.628600 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:12.628554 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 16 22:46:15.165703 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.165680 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:46:15.179125 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.179096 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fa030ed-f732-420e-844c-db1e7c668faa-kserve-provision-location\") pod \"5fa030ed-f732-420e-844c-db1e7c668faa\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " Apr 16 22:46:15.179290 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.179144 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5fa030ed-f732-420e-844c-db1e7c668faa-isvc-primary-0d7654-kube-rbac-proxy-sar-config\") pod \"5fa030ed-f732-420e-844c-db1e7c668faa\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " Apr 16 22:46:15.179290 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.179165 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh5d8\" (UniqueName: 
\"kubernetes.io/projected/5fa030ed-f732-420e-844c-db1e7c668faa-kube-api-access-zh5d8\") pod \"5fa030ed-f732-420e-844c-db1e7c668faa\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " Apr 16 22:46:15.179290 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.179246 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5fa030ed-f732-420e-844c-db1e7c668faa-proxy-tls\") pod \"5fa030ed-f732-420e-844c-db1e7c668faa\" (UID: \"5fa030ed-f732-420e-844c-db1e7c668faa\") " Apr 16 22:46:15.179550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.179525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fa030ed-f732-420e-844c-db1e7c668faa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5fa030ed-f732-420e-844c-db1e7c668faa" (UID: "5fa030ed-f732-420e-844c-db1e7c668faa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:46:15.179613 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.179533 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fa030ed-f732-420e-844c-db1e7c668faa-isvc-primary-0d7654-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-0d7654-kube-rbac-proxy-sar-config") pod "5fa030ed-f732-420e-844c-db1e7c668faa" (UID: "5fa030ed-f732-420e-844c-db1e7c668faa"). InnerVolumeSpecName "isvc-primary-0d7654-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:46:15.181406 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.181382 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa030ed-f732-420e-844c-db1e7c668faa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5fa030ed-f732-420e-844c-db1e7c668faa" (UID: "5fa030ed-f732-420e-844c-db1e7c668faa"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:46:15.181507 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.181446 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa030ed-f732-420e-844c-db1e7c668faa-kube-api-access-zh5d8" (OuterVolumeSpecName: "kube-api-access-zh5d8") pod "5fa030ed-f732-420e-844c-db1e7c668faa" (UID: "5fa030ed-f732-420e-844c-db1e7c668faa"). InnerVolumeSpecName "kube-api-access-zh5d8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:46:15.280308 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.280216 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5fa030ed-f732-420e-844c-db1e7c668faa-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.280308 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.280252 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fa030ed-f732-420e-844c-db1e7c668faa-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.280308 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.280264 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-0d7654-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5fa030ed-f732-420e-844c-db1e7c668faa-isvc-primary-0d7654-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.280308 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.280276 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zh5d8\" (UniqueName: \"kubernetes.io/projected/5fa030ed-f732-420e-844c-db1e7c668faa-kube-api-access-zh5d8\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.948790 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.948754 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="5fa030ed-f732-420e-844c-db1e7c668faa" containerID="35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa" exitCode=0 Apr 16 22:46:15.948984 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.948836 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" Apr 16 22:46:15.948984 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.948836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" event={"ID":"5fa030ed-f732-420e-844c-db1e7c668faa","Type":"ContainerDied","Data":"35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa"} Apr 16 22:46:15.948984 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.948879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm" event={"ID":"5fa030ed-f732-420e-844c-db1e7c668faa","Type":"ContainerDied","Data":"c02f6b5502ca1945d165e7e37799dbe588af00b2275103f1a3d4cf3f662cde60"} Apr 16 22:46:15.948984 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.948901 2576 scope.go:117] "RemoveContainer" containerID="aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f" Apr 16 22:46:15.956979 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.956940 2576 scope.go:117] "RemoveContainer" containerID="35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa" Apr 16 22:46:15.964311 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.964289 2576 scope.go:117] "RemoveContainer" containerID="2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95" Apr 16 22:46:15.966942 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.966917 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm"] Apr 16 22:46:15.970943 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.970920 2576 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0d7654-predictor-ff757f45b-xphvm"] Apr 16 22:46:15.972984 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.972964 2576 scope.go:117] "RemoveContainer" containerID="aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f" Apr 16 22:46:15.973294 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:15.973267 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f\": container with ID starting with aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f not found: ID does not exist" containerID="aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f" Apr 16 22:46:15.973385 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.973295 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f"} err="failed to get container status \"aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f\": rpc error: code = NotFound desc = could not find container \"aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f\": container with ID starting with aaf2cf1ea67f67d64d576e08d9e1a57d4665fff3aa9584942267a3c5bbb21d9f not found: ID does not exist" Apr 16 22:46:15.973385 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.973316 2576 scope.go:117] "RemoveContainer" containerID="35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa" Apr 16 22:46:15.973595 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:15.973574 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa\": container with ID starting with 35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa not found: 
ID does not exist" containerID="35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa" Apr 16 22:46:15.973643 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.973602 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa"} err="failed to get container status \"35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa\": rpc error: code = NotFound desc = could not find container \"35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa\": container with ID starting with 35f2c4252edbe24a5eea6fb1878afc28cc8d455be33c97bdfd193b4f957063fa not found: ID does not exist" Apr 16 22:46:15.973643 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.973618 2576 scope.go:117] "RemoveContainer" containerID="2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95" Apr 16 22:46:15.973883 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:15.973864 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95\": container with ID starting with 2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95 not found: ID does not exist" containerID="2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95" Apr 16 22:46:15.973946 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:15.973904 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95"} err="failed to get container status \"2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95\": rpc error: code = NotFound desc = could not find container \"2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95\": container with ID starting with 2ddaad143ea8d36029c4936b5c0759f93c463118953dc5f180543e3935974d95 not found: ID does not 
exist" Apr 16 22:46:16.954389 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:16.954363 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg_58a51b10-a9df-4d2e-a933-a40d4cd327f1/storage-initializer/0.log" Apr 16 22:46:16.954860 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:16.954401 2576 generic.go:358] "Generic (PLEG): container finished" podID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerID="4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02" exitCode=1 Apr 16 22:46:16.954860 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:16.954440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" event={"ID":"58a51b10-a9df-4d2e-a933-a40d4cd327f1","Type":"ContainerDied","Data":"4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02"} Apr 16 22:46:17.454492 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:17.454461 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" path="/var/lib/kubelet/pods/5fa030ed-f732-420e-844c-db1e7c668faa/volumes" Apr 16 22:46:17.959681 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:17.959658 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg_58a51b10-a9df-4d2e-a933-a40d4cd327f1/storage-initializer/0.log" Apr 16 22:46:17.960090 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:17.959774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" event={"ID":"58a51b10-a9df-4d2e-a933-a40d4cd327f1","Type":"ContainerStarted","Data":"77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52"} Apr 16 22:46:20.793184 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.793095 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg"] Apr 16 22:46:20.793653 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.793458 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" podUID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerName="storage-initializer" containerID="cri-o://77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52" gracePeriod=30 Apr 16 22:46:20.910148 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910116 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv"] Apr 16 22:46:20.910505 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910488 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a502f95-1664-41fd-9531-71b9492f8b69" containerName="storage-initializer" Apr 16 22:46:20.910645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910508 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a502f95-1664-41fd-9531-71b9492f8b69" containerName="storage-initializer" Apr 16 22:46:20.910645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910526 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="storage-initializer" Apr 16 22:46:20.910645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910534 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="storage-initializer" Apr 16 22:46:20.910645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910552 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kube-rbac-proxy" Apr 16 22:46:20.910645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910561 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kube-rbac-proxy" Apr 16 22:46:20.910645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910583 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" Apr 16 22:46:20.910645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910590 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" Apr 16 22:46:20.911017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910667 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kserve-container" Apr 16 22:46:20.911017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910683 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a502f95-1664-41fd-9531-71b9492f8b69" containerName="storage-initializer" Apr 16 22:46:20.911017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910696 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a502f95-1664-41fd-9531-71b9492f8b69" containerName="storage-initializer" Apr 16 22:46:20.911017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910707 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5fa030ed-f732-420e-844c-db1e7c668faa" containerName="kube-rbac-proxy" Apr 16 22:46:20.911017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910781 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a502f95-1664-41fd-9531-71b9492f8b69" containerName="storage-initializer" Apr 16 22:46:20.911017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.910791 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a502f95-1664-41fd-9531-71b9492f8b69" containerName="storage-initializer" Apr 16 22:46:20.914108 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.914083 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:20.916421 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.916394 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-f9tz9\"" Apr 16 22:46:20.916421 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.916405 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 16 22:46:20.916607 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.916405 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 16 22:46:20.922610 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.922577 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv"] Apr 16 22:46:20.940930 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.940904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg_58a51b10-a9df-4d2e-a933-a40d4cd327f1/storage-initializer/1.log" Apr 16 22:46:20.941376 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.941354 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg_58a51b10-a9df-4d2e-a933-a40d4cd327f1/storage-initializer/0.log" Apr 16 22:46:20.941512 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.941422 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:20.968935 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.968908 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg_58a51b10-a9df-4d2e-a933-a40d4cd327f1/storage-initializer/1.log" Apr 16 22:46:20.969320 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.969300 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg_58a51b10-a9df-4d2e-a933-a40d4cd327f1/storage-initializer/0.log" Apr 16 22:46:20.969454 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.969355 2576 generic.go:358] "Generic (PLEG): container finished" podID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerID="77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52" exitCode=1 Apr 16 22:46:20.969454 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.969390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" event={"ID":"58a51b10-a9df-4d2e-a933-a40d4cd327f1","Type":"ContainerDied","Data":"77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52"} Apr 16 22:46:20.969454 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.969428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" event={"ID":"58a51b10-a9df-4d2e-a933-a40d4cd327f1","Type":"ContainerDied","Data":"691519c88647ba0bf84a19172fb2de523b8297b372e8bb2df1dd59978c720382"} Apr 16 22:46:20.969454 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.969445 2576 scope.go:117] "RemoveContainer" containerID="77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52" Apr 16 22:46:20.969661 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.969445 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg" Apr 16 22:46:20.977242 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.977224 2576 scope.go:117] "RemoveContainer" containerID="4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02" Apr 16 22:46:20.985025 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.985006 2576 scope.go:117] "RemoveContainer" containerID="77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52" Apr 16 22:46:20.985286 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:20.985266 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52\": container with ID starting with 77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52 not found: ID does not exist" containerID="77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52" Apr 16 22:46:20.985386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.985296 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52"} err="failed to get container status \"77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52\": rpc error: code = NotFound desc = could not find container \"77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52\": container with ID starting with 77a1efd254d72be925343b21ba775c14d913f7c90f44b93e4821963dd6ecab52 not found: ID does not exist" Apr 16 22:46:20.985386 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.985318 2576 scope.go:117] "RemoveContainer" containerID="4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02" Apr 16 22:46:20.985578 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:20.985563 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02\": container with ID starting with 4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02 not found: ID does not exist" containerID="4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02" Apr 16 22:46:20.985622 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:20.985582 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02"} err="failed to get container status \"4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02\": rpc error: code = NotFound desc = could not find container \"4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02\": container with ID starting with 4257484216a6bc7d424f8b696836e94414d8255569bf34ec2460ccb9fc909c02 not found: ID does not exist" Apr 16 22:46:21.023634 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023599 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-241fac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-isvc-init-fail-241fac-kube-rbac-proxy-sar-config\") pod \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " Apr 16 22:46:21.023833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023643 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grv7n\" (UniqueName: \"kubernetes.io/projected/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kube-api-access-grv7n\") pod \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " Apr 16 22:46:21.023833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023664 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kserve-provision-location\") pod \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " Apr 16 22:46:21.023833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023715 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls\") pod \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " Apr 16 22:46:21.023833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023751 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-cabundle-cert\") pod \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\" (UID: \"58a51b10-a9df-4d2e-a933-a40d4cd327f1\") " Apr 16 22:46:21.023833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.024098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6ppt\" (UniqueName: \"kubernetes.io/projected/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kube-api-access-j6ppt\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.024098 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:46:21.023874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.024098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023972 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "58a51b10-a9df-4d2e-a933-a40d4cd327f1" (UID: "58a51b10-a9df-4d2e-a933-a40d4cd327f1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:46:21.024098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.023997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.024098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.024042 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-isvc-init-fail-241fac-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-241fac-kube-rbac-proxy-sar-config") pod "58a51b10-a9df-4d2e-a933-a40d4cd327f1" (UID: "58a51b10-a9df-4d2e-a933-a40d4cd327f1"). InnerVolumeSpecName "isvc-init-fail-241fac-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:46:21.024098 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.024085 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:21.024317 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.024273 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "58a51b10-a9df-4d2e-a933-a40d4cd327f1" (UID: "58a51b10-a9df-4d2e-a933-a40d4cd327f1"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:46:21.025916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.025892 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "58a51b10-a9df-4d2e-a933-a40d4cd327f1" (UID: "58a51b10-a9df-4d2e-a933-a40d4cd327f1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:46:21.025916 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.025904 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kube-api-access-grv7n" (OuterVolumeSpecName: "kube-api-access-grv7n") pod "58a51b10-a9df-4d2e-a933-a40d4cd327f1" (UID: "58a51b10-a9df-4d2e-a933-a40d4cd327f1"). InnerVolumeSpecName "kube-api-access-grv7n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:46:21.124735 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.124631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.124735 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.124683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6ppt\" (UniqueName: \"kubernetes.io/projected/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kube-api-access-j6ppt\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.124735 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.124716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.125005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.124785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 
22:46:21.125005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.124838 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58a51b10-a9df-4d2e-a933-a40d4cd327f1-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:21.125005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.124854 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-cabundle-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:21.125005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.124868 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-241fac-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58a51b10-a9df-4d2e-a933-a40d4cd327f1-isvc-init-fail-241fac-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:21.125005 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.124882 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grv7n\" (UniqueName: \"kubernetes.io/projected/58a51b10-a9df-4d2e-a933-a40d4cd327f1-kube-api-access-grv7n\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:46:21.125005 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:21.124974 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-serving-cert: secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 16 22:46:21.125229 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:46:21.125053 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls podName:a97c07b9-b0ca-452b-8a70-d2d55b27f8f5 nodeName:}" failed. No retries permitted until 2026-04-16 22:46:21.625030449 +0000 UTC m=+1960.749654714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls") pod "isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" (UID: "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5") : secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 16 22:46:21.125229 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.125122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.125424 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.125406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.136952 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.136927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6ppt\" (UniqueName: \"kubernetes.io/projected/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kube-api-access-j6ppt\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.305847 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.305820 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg"] Apr 16 
22:46:21.310246 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.310216 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-241fac-predictor-79dffbcd5f-c5zcg"] Apr 16 22:46:21.452457 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.452417 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" path="/var/lib/kubelet/pods/58a51b10-a9df-4d2e-a933-a40d4cd327f1/volumes" Apr 16 22:46:21.627877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.627847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.630253 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.630235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.840074 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.839974 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:21.959930 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.959896 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv"] Apr 16 22:46:21.963704 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:46:21.963672 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda97c07b9_b0ca_452b_8a70_d2d55b27f8f5.slice/crio-304f62c8473eadf5321d4635109bb7181c6839158e60f424cca0cde1b4169b1f WatchSource:0}: Error finding container 304f62c8473eadf5321d4635109bb7181c6839158e60f424cca0cde1b4169b1f: Status 404 returned error can't find the container with id 304f62c8473eadf5321d4635109bb7181c6839158e60f424cca0cde1b4169b1f Apr 16 22:46:21.973742 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:21.973718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" event={"ID":"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5","Type":"ContainerStarted","Data":"304f62c8473eadf5321d4635109bb7181c6839158e60f424cca0cde1b4169b1f"} Apr 16 22:46:22.979241 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:22.979207 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" event={"ID":"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5","Type":"ContainerStarted","Data":"0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007"} Apr 16 22:46:25.990828 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:25.990794 2576 generic.go:358] "Generic (PLEG): container finished" podID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerID="0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007" exitCode=0 Apr 16 22:46:25.991213 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:25.990865 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" event={"ID":"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5","Type":"ContainerDied","Data":"0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007"} Apr 16 22:46:45.898021 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:45.898000 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:46:46.068821 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:46.068786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" event={"ID":"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5","Type":"ContainerStarted","Data":"1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed"} Apr 16 22:46:46.068821 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:46.068825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" event={"ID":"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5","Type":"ContainerStarted","Data":"970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee"} Apr 16 22:46:46.069033 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:46.068990 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:46.087280 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:46.087237 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podStartSLOduration=6.329672365 podStartE2EDuration="26.087224245s" podCreationTimestamp="2026-04-16 22:46:20 +0000 UTC" firstStartedPulling="2026-04-16 22:46:25.991995094 +0000 UTC m=+1965.116619359" lastFinishedPulling="2026-04-16 22:46:45.749546975 +0000 UTC m=+1984.874171239" observedRunningTime="2026-04-16 22:46:46.086091472 +0000 UTC m=+1985.210715769" 
watchObservedRunningTime="2026-04-16 22:46:46.087224245 +0000 UTC m=+1985.211848533" Apr 16 22:46:47.072121 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:47.072078 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:47.073146 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:47.073119 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:46:48.075710 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:48.075669 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:46:53.081170 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:53.081139 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:46:53.081643 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:46:53.081616 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:47:03.081719 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:47:03.081678 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:47:13.081754 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:47:13.081699 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:47:23.081545 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:47:23.081505 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:47:33.082343 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:47:33.082281 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:47:43.082032 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:47:43.081988 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:47:53.082297 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:47:53.082257 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 
22:48:03.082611 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:03.082534 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:48:11.070531 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.070498 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv"] Apr 16 22:48:11.070913 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.070852 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" containerID="cri-o://970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee" gracePeriod=30 Apr 16 22:48:11.070913 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.070897 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kube-rbac-proxy" containerID="cri-o://1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed" gracePeriod=30 Apr 16 22:48:11.214830 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.214787 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t"] Apr 16 22:48:11.215174 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.215155 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerName="storage-initializer" Apr 16 22:48:11.215256 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.215177 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerName="storage-initializer" Apr 16 22:48:11.215256 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:48:11.215188 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerName="storage-initializer" Apr 16 22:48:11.215256 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.215196 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerName="storage-initializer" Apr 16 22:48:11.215441 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.215295 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerName="storage-initializer" Apr 16 22:48:11.215441 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.215309 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="58a51b10-a9df-4d2e-a933-a40d4cd327f1" containerName="storage-initializer" Apr 16 22:48:11.218228 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.218205 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.221180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.221152 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 16 22:48:11.221277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.221160 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 16 22:48:11.229284 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.229262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t"] Apr 16 22:48:11.335354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.335249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.335354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.335291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.335556 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.335376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwn4z\" (UniqueName: \"kubernetes.io/projected/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kube-api-access-gwn4z\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.335556 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.335408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.340002 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.339969 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerID="1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed" exitCode=2 Apr 16 22:48:11.340138 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.340046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" event={"ID":"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5","Type":"ContainerDied","Data":"1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed"} Apr 16 22:48:11.436790 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.436749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.436974 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.436810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.436974 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.436859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwn4z\" (UniqueName: \"kubernetes.io/projected/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kube-api-access-gwn4z\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.437163 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:48:11.437051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.437422 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.437386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.437422 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.437410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.439337 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.439309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.446527 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.446507 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gwn4z\" (UniqueName: \"kubernetes.io/projected/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kube-api-access-gwn4z\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.529024 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.528993 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:11.651071 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:11.651044 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t"] Apr 16 22:48:11.652936 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:48:11.652908 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07dac8a7_77c8_4a70_8fbe_6c5141897bdd.slice/crio-16a020d688ece2131a4a6f3e92c328db1475dcc48c717d81246fc1144a713a58 WatchSource:0}: Error finding container 16a020d688ece2131a4a6f3e92c328db1475dcc48c717d81246fc1144a713a58: Status 404 returned error can't find the container with id 16a020d688ece2131a4a6f3e92c328db1475dcc48c717d81246fc1144a713a58 Apr 16 22:48:12.346062 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:12.346028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" event={"ID":"07dac8a7-77c8-4a70-8fbe-6c5141897bdd","Type":"ContainerStarted","Data":"f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc"} Apr 16 22:48:12.346483 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:12.346065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" 
event={"ID":"07dac8a7-77c8-4a70-8fbe-6c5141897bdd","Type":"ContainerStarted","Data":"16a020d688ece2131a4a6f3e92c328db1475dcc48c717d81246fc1144a713a58"} Apr 16 22:48:13.076096 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:13.076052 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 16 22:48:13.081627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:13.081587 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 22:48:15.359081 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:15.359051 2576 generic.go:358] "Generic (PLEG): container finished" podID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerID="f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc" exitCode=0 Apr 16 22:48:15.359440 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:15.359125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" event={"ID":"07dac8a7-77c8-4a70-8fbe-6c5141897bdd","Type":"ContainerDied","Data":"f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc"} Apr 16 22:48:16.007067 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.007047 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:48:16.175286 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.175201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6ppt\" (UniqueName: \"kubernetes.io/projected/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kube-api-access-j6ppt\") pod \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " Apr 16 22:48:16.175286 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.175241 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls\") pod \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " Apr 16 22:48:16.175286 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.175289 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " Apr 16 22:48:16.175547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.175385 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kserve-provision-location\") pod \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\" (UID: \"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5\") " Apr 16 22:48:16.175779 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.175741 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" (UID: "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:48:16.175779 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.175739 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" (UID: "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:48:16.177366 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.177340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" (UID: "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:48:16.177468 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.177419 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kube-api-access-j6ppt" (OuterVolumeSpecName: "kube-api-access-j6ppt") pod "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" (UID: "a97c07b9-b0ca-452b-8a70-d2d55b27f8f5"). InnerVolumeSpecName "kube-api-access-j6ppt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:48:16.276211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.276176 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:48:16.276211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.276208 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j6ppt\" (UniqueName: \"kubernetes.io/projected/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-kube-api-access-j6ppt\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:48:16.276211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.276220 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:48:16.276471 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.276230 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:48:16.364010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.363977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" event={"ID":"07dac8a7-77c8-4a70-8fbe-6c5141897bdd","Type":"ContainerStarted","Data":"b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b"} Apr 16 22:48:16.364448 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.364022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" 
event={"ID":"07dac8a7-77c8-4a70-8fbe-6c5141897bdd","Type":"ContainerStarted","Data":"238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4"} Apr 16 22:48:16.364448 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.364357 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:16.364574 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.364498 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:16.365684 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.365659 2576 generic.go:358] "Generic (PLEG): container finished" podID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerID="970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee" exitCode=0 Apr 16 22:48:16.365791 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.365734 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" event={"ID":"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5","Type":"ContainerDied","Data":"970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee"} Apr 16 22:48:16.365791 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.365751 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" Apr 16 22:48:16.365791 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.365765 2576 scope.go:117] "RemoveContainer" containerID="1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed" Apr 16 22:48:16.365949 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.365755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv" event={"ID":"a97c07b9-b0ca-452b-8a70-d2d55b27f8f5","Type":"ContainerDied","Data":"304f62c8473eadf5321d4635109bb7181c6839158e60f424cca0cde1b4169b1f"} Apr 16 22:48:16.366011 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.365939 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:48:16.373747 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.373730 2576 scope.go:117] "RemoveContainer" containerID="970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee" Apr 16 22:48:16.380497 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.380482 2576 scope.go:117] "RemoveContainer" containerID="0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007" Apr 16 22:48:16.382625 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.382589 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podStartSLOduration=5.382576322 podStartE2EDuration="5.382576322s" podCreationTimestamp="2026-04-16 22:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:48:16.380649121 +0000 UTC m=+2075.505273444" 
watchObservedRunningTime="2026-04-16 22:48:16.382576322 +0000 UTC m=+2075.507200606" Apr 16 22:48:16.387489 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.387472 2576 scope.go:117] "RemoveContainer" containerID="1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed" Apr 16 22:48:16.387731 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:48:16.387711 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed\": container with ID starting with 1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed not found: ID does not exist" containerID="1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed" Apr 16 22:48:16.387784 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.387739 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed"} err="failed to get container status \"1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed\": rpc error: code = NotFound desc = could not find container \"1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed\": container with ID starting with 1a716475db43405534c3b2892ae3c73ae292634e06500bfba002e75a262ab3ed not found: ID does not exist" Apr 16 22:48:16.387784 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.387756 2576 scope.go:117] "RemoveContainer" containerID="970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee" Apr 16 22:48:16.387986 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:48:16.387967 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee\": container with ID starting with 970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee not found: ID does not exist" 
containerID="970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee" Apr 16 22:48:16.388053 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.387998 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee"} err="failed to get container status \"970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee\": rpc error: code = NotFound desc = could not find container \"970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee\": container with ID starting with 970d0a9b374d0f4493fb5649bf7a1f9db9b80d003d172443b4399d154e1e86ee not found: ID does not exist" Apr 16 22:48:16.388053 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.388021 2576 scope.go:117] "RemoveContainer" containerID="0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007" Apr 16 22:48:16.388244 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:48:16.388227 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007\": container with ID starting with 0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007 not found: ID does not exist" containerID="0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007" Apr 16 22:48:16.388283 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.388249 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007"} err="failed to get container status \"0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007\": rpc error: code = NotFound desc = could not find container \"0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007\": container with ID starting with 0a659d32bf3057f2a87733379bf37718fe6ee33f2c6480a12c1de9397be24007 not found: ID does not exist" Apr 16 
22:48:16.393986 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.393966 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv"] Apr 16 22:48:16.402234 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:16.402214 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hqwrv"] Apr 16 22:48:17.372674 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:17.372631 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:48:17.452939 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:17.452896 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" path="/var/lib/kubelet/pods/a97c07b9-b0ca-452b-8a70-d2d55b27f8f5/volumes" Apr 16 22:48:22.376584 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:22.376545 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:48:22.377170 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:22.377142 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:48:32.377363 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:32.377301 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:48:42.377192 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:42.377149 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:48:52.377956 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:48:52.377917 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:49:02.377491 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:02.377450 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:49:12.377406 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:12.377363 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:49:22.377562 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:22.377520 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:49:27.449603 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:27.449558 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 22:49:37.453486 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:37.453411 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:49:41.264892 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.264854 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t"] Apr 16 22:49:41.265848 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.265811 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" containerID="cri-o://238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4" gracePeriod=30 Apr 16 22:49:41.265998 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.265892 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kube-rbac-proxy" containerID="cri-o://b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b" gracePeriod=30 Apr 16 22:49:41.356118 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356079 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l"] Apr 16 22:49:41.356474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356459 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="storage-initializer" Apr 16 22:49:41.356580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356475 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="storage-initializer" Apr 16 22:49:41.356580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356495 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" Apr 16 22:49:41.356580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356501 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" Apr 16 22:49:41.356580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356511 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kube-rbac-proxy" Apr 16 22:49:41.356580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356517 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kube-rbac-proxy" Apr 16 22:49:41.356580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356565 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kube-rbac-proxy" Apr 16 22:49:41.356580 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.356575 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a97c07b9-b0ca-452b-8a70-d2d55b27f8f5" containerName="kserve-container" Apr 16 22:49:41.359709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.359690 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.361917 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.361893 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 16 22:49:41.362119 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.362104 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 16 22:49:41.368133 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.368109 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l"] Apr 16 22:49:41.438276 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.438239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa92d46d-7898-4aa4-ad50-b632794686b8-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.438276 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.438282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa92d46d-7898-4aa4-ad50-b632794686b8-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.438522 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.438306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa92d46d-7898-4aa4-ad50-b632794686b8-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.438522 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.438396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78d5h\" (UniqueName: \"kubernetes.io/projected/aa92d46d-7898-4aa4-ad50-b632794686b8-kube-api-access-78d5h\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.539659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.539200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa92d46d-7898-4aa4-ad50-b632794686b8-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.539659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.539253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa92d46d-7898-4aa4-ad50-b632794686b8-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.539659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.539306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/aa92d46d-7898-4aa4-ad50-b632794686b8-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.539659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.539384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78d5h\" (UniqueName: \"kubernetes.io/projected/aa92d46d-7898-4aa4-ad50-b632794686b8-kube-api-access-78d5h\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.540000 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.539941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa92d46d-7898-4aa4-ad50-b632794686b8-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.540179 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.540155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa92d46d-7898-4aa4-ad50-b632794686b8-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.542232 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.542208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/aa92d46d-7898-4aa4-ad50-b632794686b8-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.547079 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.547057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78d5h\" (UniqueName: \"kubernetes.io/projected/aa92d46d-7898-4aa4-ad50-b632794686b8-kube-api-access-78d5h\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.632898 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.632862 2576 generic.go:358] "Generic (PLEG): container finished" podID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerID="b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b" exitCode=2 Apr 16 22:49:41.633066 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.632935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" event={"ID":"07dac8a7-77c8-4a70-8fbe-6c5141897bdd","Type":"ContainerDied","Data":"b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b"} Apr 16 22:49:41.670849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.670818 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:41.789195 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:41.789021 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l"] Apr 16 22:49:41.791982 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:49:41.791952 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa92d46d_7898_4aa4_ad50_b632794686b8.slice/crio-5c6e5623f2bdd39935d593d26c59498ef7ed85afe1b219001ec87dcc393dd456 WatchSource:0}: Error finding container 5c6e5623f2bdd39935d593d26c59498ef7ed85afe1b219001ec87dcc393dd456: Status 404 returned error can't find the container with id 5c6e5623f2bdd39935d593d26c59498ef7ed85afe1b219001ec87dcc393dd456 Apr 16 22:49:42.373485 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:42.373443 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused" Apr 16 22:49:42.639658 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:42.639570 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" event={"ID":"aa92d46d-7898-4aa4-ad50-b632794686b8","Type":"ContainerStarted","Data":"c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c"} Apr 16 22:49:42.639658 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:42.639607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" 
event={"ID":"aa92d46d-7898-4aa4-ad50-b632794686b8","Type":"ContainerStarted","Data":"5c6e5623f2bdd39935d593d26c59498ef7ed85afe1b219001ec87dcc393dd456"} Apr 16 22:49:46.010110 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.010082 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:49:46.072019 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.071984 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-proxy-tls\") pod \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " Apr 16 22:49:46.072212 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.072045 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwn4z\" (UniqueName: \"kubernetes.io/projected/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kube-api-access-gwn4z\") pod \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " Apr 16 22:49:46.072212 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.072077 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kserve-provision-location\") pod \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " Apr 16 22:49:46.072212 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.072117 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\" (UID: \"07dac8a7-77c8-4a70-8fbe-6c5141897bdd\") " Apr 16 
22:49:46.072470 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.072445 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07dac8a7-77c8-4a70-8fbe-6c5141897bdd" (UID: "07dac8a7-77c8-4a70-8fbe-6c5141897bdd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:49:46.072583 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.072552 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "07dac8a7-77c8-4a70-8fbe-6c5141897bdd" (UID: "07dac8a7-77c8-4a70-8fbe-6c5141897bdd"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:49:46.074184 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.074163 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kube-api-access-gwn4z" (OuterVolumeSpecName: "kube-api-access-gwn4z") pod "07dac8a7-77c8-4a70-8fbe-6c5141897bdd" (UID: "07dac8a7-77c8-4a70-8fbe-6c5141897bdd"). InnerVolumeSpecName "kube-api-access-gwn4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:49:46.074240 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.074173 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "07dac8a7-77c8-4a70-8fbe-6c5141897bdd" (UID: "07dac8a7-77c8-4a70-8fbe-6c5141897bdd"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:49:46.173302 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.173249 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:49:46.173516 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.173318 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwn4z\" (UniqueName: \"kubernetes.io/projected/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kube-api-access-gwn4z\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:49:46.173516 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.173356 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:49:46.173516 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.173366 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07dac8a7-77c8-4a70-8fbe-6c5141897bdd-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:49:46.654480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.654444 2576 generic.go:358] "Generic (PLEG): container finished" podID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerID="238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4" exitCode=0 Apr 16 22:49:46.654671 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.654528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" 
event={"ID":"07dac8a7-77c8-4a70-8fbe-6c5141897bdd","Type":"ContainerDied","Data":"238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4"} Apr 16 22:49:46.654671 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.654540 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" Apr 16 22:49:46.654671 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.654569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t" event={"ID":"07dac8a7-77c8-4a70-8fbe-6c5141897bdd","Type":"ContainerDied","Data":"16a020d688ece2131a4a6f3e92c328db1475dcc48c717d81246fc1144a713a58"} Apr 16 22:49:46.654671 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.654582 2576 scope.go:117] "RemoveContainer" containerID="b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b" Apr 16 22:49:46.656069 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.656045 2576 generic.go:358] "Generic (PLEG): container finished" podID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerID="c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c" exitCode=0 Apr 16 22:49:46.656180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.656094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" event={"ID":"aa92d46d-7898-4aa4-ad50-b632794686b8","Type":"ContainerDied","Data":"c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c"} Apr 16 22:49:46.662876 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.662855 2576 scope.go:117] "RemoveContainer" containerID="238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4" Apr 16 22:49:46.671850 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.671832 2576 scope.go:117] "RemoveContainer" containerID="f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc" Apr 16 
22:49:46.680073 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.680050 2576 scope.go:117] "RemoveContainer" containerID="b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b" Apr 16 22:49:46.680399 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:49:46.680361 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b\": container with ID starting with b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b not found: ID does not exist" containerID="b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b" Apr 16 22:49:46.680512 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.680397 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b"} err="failed to get container status \"b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b\": rpc error: code = NotFound desc = could not find container \"b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b\": container with ID starting with b98a65f5f6aa10636c96bc1fa6ede57997b3b146228b0d4d240bb3e96eb1343b not found: ID does not exist" Apr 16 22:49:46.680512 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.680427 2576 scope.go:117] "RemoveContainer" containerID="238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4" Apr 16 22:49:46.680699 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:49:46.680677 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4\": container with ID starting with 238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4 not found: ID does not exist" containerID="238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4" Apr 16 22:49:46.680747 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.680711 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4"} err="failed to get container status \"238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4\": rpc error: code = NotFound desc = could not find container \"238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4\": container with ID starting with 238238d576f028302df8be9b4bcb64b4d9ffb289d13f4d9bb896e9a8772212d4 not found: ID does not exist" Apr 16 22:49:46.680747 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.680736 2576 scope.go:117] "RemoveContainer" containerID="f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc" Apr 16 22:49:46.680953 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:49:46.680938 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc\": container with ID starting with f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc not found: ID does not exist" containerID="f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc" Apr 16 22:49:46.681010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.680955 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc"} err="failed to get container status \"f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc\": rpc error: code = NotFound desc = could not find container \"f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc\": container with ID starting with f2f470c0878938e671b9f0b4b8c556a132b85f1aebf77e41d031f23c39b910bc not found: ID does not exist" Apr 16 22:49:46.687311 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.687286 2576 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t"] Apr 16 22:49:46.692768 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:46.692747 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-fhw8t"] Apr 16 22:49:47.453364 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:47.453312 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" path="/var/lib/kubelet/pods/07dac8a7-77c8-4a70-8fbe-6c5141897bdd/volumes" Apr 16 22:49:47.661448 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:47.661417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" event={"ID":"aa92d46d-7898-4aa4-ad50-b632794686b8","Type":"ContainerStarted","Data":"717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2"} Apr 16 22:49:47.661612 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:47.661454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" event={"ID":"aa92d46d-7898-4aa4-ad50-b632794686b8","Type":"ContainerStarted","Data":"f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4"} Apr 16 22:49:47.661751 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:47.661734 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:47.661892 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:47.661863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:47.662987 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:47.662960 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" 
podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:49:47.681115 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:47.681067 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podStartSLOduration=6.681054581 podStartE2EDuration="6.681054581s" podCreationTimestamp="2026-04-16 22:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:49:47.678717482 +0000 UTC m=+2166.803341769" watchObservedRunningTime="2026-04-16 22:49:47.681054581 +0000 UTC m=+2166.805678867" Apr 16 22:49:48.665882 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:48.665844 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:49:53.670170 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:53.670139 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:49:53.670733 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:49:53.670704 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:50:03.671640 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:50:03.671600 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" 
podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:50:13.671068 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:50:13.671028 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:50:23.671658 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:50:23.671620 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:50:33.671226 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:50:33.671185 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:50:43.670631 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:50:43.670592 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:50:53.671389 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:50:53.671345 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:51:03.672187 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:03.672107 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:51:11.456063 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.456029 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l"] Apr 16 22:51:11.456506 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.456422 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" containerID="cri-o://f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4" gracePeriod=30 Apr 16 22:51:11.456578 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.456477 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kube-rbac-proxy" containerID="cri-o://717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2" gracePeriod=30 Apr 16 22:51:11.556578 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.556534 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j"] Apr 16 22:51:11.556923 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.556904 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" Apr 16 22:51:11.557007 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.556925 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" 
containerName="kserve-container" Apr 16 22:51:11.557007 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.556941 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="storage-initializer" Apr 16 22:51:11.557007 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.556949 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="storage-initializer" Apr 16 22:51:11.557007 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.556975 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kube-rbac-proxy" Apr 16 22:51:11.557007 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.556984 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kube-rbac-proxy" Apr 16 22:51:11.557267 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.557052 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kube-rbac-proxy" Apr 16 22:51:11.557267 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.557067 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="07dac8a7-77c8-4a70-8fbe-6c5141897bdd" containerName="kserve-container" Apr 16 22:51:11.561337 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.561297 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.563818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.563793 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 16 22:51:11.563818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.563802 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 16 22:51:11.568557 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.568533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j"] Apr 16 22:51:11.635358 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.635291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c32839f-549d-453e-9689-3f219c044979-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.635548 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.635368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.635548 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.635398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c32839f-549d-453e-9689-3f219c044979-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.635548 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.635430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvt7w\" (UniqueName: \"kubernetes.io/projected/6c32839f-549d-453e-9689-3f219c044979-kube-api-access-zvt7w\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.735993 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.735897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c32839f-549d-453e-9689-3f219c044979-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.735993 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.735946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.735993 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.735981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c32839f-549d-453e-9689-3f219c044979-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.736264 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.736026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvt7w\" (UniqueName: \"kubernetes.io/projected/6c32839f-549d-453e-9689-3f219c044979-kube-api-access-zvt7w\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.736264 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:51:11.736137 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-serving-cert: secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 16 22:51:11.736264 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:51:11.736209 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls podName:6c32839f-549d-453e-9689-3f219c044979 nodeName:}" failed. No retries permitted until 2026-04-16 22:51:12.236186261 +0000 UTC m=+2251.360810528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls") pod "isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" (UID: "6c32839f-549d-453e-9689-3f219c044979") : secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 16 22:51:11.736488 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.736471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c32839f-549d-453e-9689-3f219c044979-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.736707 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.736689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c32839f-549d-453e-9689-3f219c044979-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.745183 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.745156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvt7w\" (UniqueName: \"kubernetes.io/projected/6c32839f-549d-453e-9689-3f219c044979-kube-api-access-zvt7w\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:11.929678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.929644 2576 generic.go:358] "Generic (PLEG): container finished" podID="aa92d46d-7898-4aa4-ad50-b632794686b8" 
containerID="717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2" exitCode=2 Apr 16 22:51:11.929844 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:11.929713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" event={"ID":"aa92d46d-7898-4aa4-ad50-b632794686b8","Type":"ContainerDied","Data":"717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2"} Apr 16 22:51:12.240584 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:12.240547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:12.242982 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:12.242951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:12.472872 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:12.472833 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:12.591879 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:12.591849 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j"] Apr 16 22:51:12.595047 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:51:12.595019 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c32839f_549d_453e_9689_3f219c044979.slice/crio-e35ba2007c346c68b5efa9bfb2f2902b8295d9dc6ee41313bb78b9c3dc2e31f4 WatchSource:0}: Error finding container e35ba2007c346c68b5efa9bfb2f2902b8295d9dc6ee41313bb78b9c3dc2e31f4: Status 404 returned error can't find the container with id e35ba2007c346c68b5efa9bfb2f2902b8295d9dc6ee41313bb78b9c3dc2e31f4 Apr 16 22:51:12.935175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:12.935097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" event={"ID":"6c32839f-549d-453e-9689-3f219c044979","Type":"ContainerStarted","Data":"ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4"} Apr 16 22:51:12.935175 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:12.935130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" event={"ID":"6c32839f-549d-453e-9689-3f219c044979","Type":"ContainerStarted","Data":"e35ba2007c346c68b5efa9bfb2f2902b8295d9dc6ee41313bb78b9c3dc2e31f4"} Apr 16 22:51:13.666658 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:13.666616 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 
10.133.0.49:8643: connect: connection refused" Apr 16 22:51:13.670953 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:13.670925 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 22:51:16.702441 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.702420 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:51:16.780672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.780597 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78d5h\" (UniqueName: \"kubernetes.io/projected/aa92d46d-7898-4aa4-ad50-b632794686b8-kube-api-access-78d5h\") pod \"aa92d46d-7898-4aa4-ad50-b632794686b8\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " Apr 16 22:51:16.780672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.780634 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa92d46d-7898-4aa4-ad50-b632794686b8-proxy-tls\") pod \"aa92d46d-7898-4aa4-ad50-b632794686b8\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " Apr 16 22:51:16.780672 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.780663 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa92d46d-7898-4aa4-ad50-b632794686b8-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"aa92d46d-7898-4aa4-ad50-b632794686b8\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " Apr 16 22:51:16.780954 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.780695 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa92d46d-7898-4aa4-ad50-b632794686b8-kserve-provision-location\") pod \"aa92d46d-7898-4aa4-ad50-b632794686b8\" (UID: \"aa92d46d-7898-4aa4-ad50-b632794686b8\") " Apr 16 22:51:16.781091 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.781063 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa92d46d-7898-4aa4-ad50-b632794686b8-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "aa92d46d-7898-4aa4-ad50-b632794686b8" (UID: "aa92d46d-7898-4aa4-ad50-b632794686b8"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:51:16.781168 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.781107 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa92d46d-7898-4aa4-ad50-b632794686b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aa92d46d-7898-4aa4-ad50-b632794686b8" (UID: "aa92d46d-7898-4aa4-ad50-b632794686b8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:51:16.782658 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.782626 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa92d46d-7898-4aa4-ad50-b632794686b8-kube-api-access-78d5h" (OuterVolumeSpecName: "kube-api-access-78d5h") pod "aa92d46d-7898-4aa4-ad50-b632794686b8" (UID: "aa92d46d-7898-4aa4-ad50-b632794686b8"). InnerVolumeSpecName "kube-api-access-78d5h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:51:16.782750 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.782674 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa92d46d-7898-4aa4-ad50-b632794686b8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "aa92d46d-7898-4aa4-ad50-b632794686b8" (UID: "aa92d46d-7898-4aa4-ad50-b632794686b8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:51:16.881833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.881785 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa92d46d-7898-4aa4-ad50-b632794686b8-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:51:16.881833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.881828 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-78d5h\" (UniqueName: \"kubernetes.io/projected/aa92d46d-7898-4aa4-ad50-b632794686b8-kube-api-access-78d5h\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:51:16.881833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.881844 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa92d46d-7898-4aa4-ad50-b632794686b8-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:51:16.882060 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.881859 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aa92d46d-7898-4aa4-ad50-b632794686b8-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:51:16.949014 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.948981 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerID="f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4" exitCode=0 Apr 16 22:51:16.949213 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.949060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" event={"ID":"aa92d46d-7898-4aa4-ad50-b632794686b8","Type":"ContainerDied","Data":"f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4"} Apr 16 22:51:16.949213 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.949069 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" Apr 16 22:51:16.949213 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.949103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l" event={"ID":"aa92d46d-7898-4aa4-ad50-b632794686b8","Type":"ContainerDied","Data":"5c6e5623f2bdd39935d593d26c59498ef7ed85afe1b219001ec87dcc393dd456"} Apr 16 22:51:16.949213 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.949124 2576 scope.go:117] "RemoveContainer" containerID="717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2" Apr 16 22:51:16.950377 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.950352 2576 generic.go:358] "Generic (PLEG): container finished" podID="6c32839f-549d-453e-9689-3f219c044979" containerID="ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4" exitCode=0 Apr 16 22:51:16.950497 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.950382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" event={"ID":"6c32839f-549d-453e-9689-3f219c044979","Type":"ContainerDied","Data":"ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4"} Apr 16 22:51:16.958445 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 22:51:16.958373 2576 scope.go:117] "RemoveContainer" containerID="f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4" Apr 16 22:51:16.968398 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.968302 2576 scope.go:117] "RemoveContainer" containerID="c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c" Apr 16 22:51:16.974350 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:51:16.974296 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa92d46d_7898_4aa4_ad50_b632794686b8.slice\": RecentStats: unable to find data in memory cache]" Apr 16 22:51:16.976514 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.976495 2576 scope.go:117] "RemoveContainer" containerID="717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2" Apr 16 22:51:16.976817 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:51:16.976798 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2\": container with ID starting with 717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2 not found: ID does not exist" containerID="717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2" Apr 16 22:51:16.976906 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.976823 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2"} err="failed to get container status \"717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2\": rpc error: code = NotFound desc = could not find container \"717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2\": container with ID starting with 717f5d303fb6d652960672c9540c43ce50ea8066cb31260da182bab930ff83e2 not found: ID does not exist" Apr 16 
22:51:16.976906 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.976851 2576 scope.go:117] "RemoveContainer" containerID="f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4" Apr 16 22:51:16.977096 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:51:16.977081 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4\": container with ID starting with f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4 not found: ID does not exist" containerID="f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4" Apr 16 22:51:16.977155 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.977100 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4"} err="failed to get container status \"f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4\": rpc error: code = NotFound desc = could not find container \"f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4\": container with ID starting with f0311c43ba5842198a04ec0713e6db193a08474f0d750ff8955c860301bc38c4 not found: ID does not exist" Apr 16 22:51:16.977155 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.977115 2576 scope.go:117] "RemoveContainer" containerID="c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c" Apr 16 22:51:16.977371 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:51:16.977348 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c\": container with ID starting with c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c not found: ID does not exist" containerID="c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c" Apr 16 22:51:16.977447 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.977380 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c"} err="failed to get container status \"c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c\": rpc error: code = NotFound desc = could not find container \"c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c\": container with ID starting with c079dfded59b8cd16b9e84c0e6ddd40499295ce5a892d35ca69f6e8675310c8c not found: ID does not exist" Apr 16 22:51:16.984760 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.984741 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l"] Apr 16 22:51:16.987818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:16.987799 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hwz4l"] Apr 16 22:51:17.453566 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:17.453537 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" path="/var/lib/kubelet/pods/aa92d46d-7898-4aa4-ad50-b632794686b8/volumes" Apr 16 22:51:17.955682 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:17.955652 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" event={"ID":"6c32839f-549d-453e-9689-3f219c044979","Type":"ContainerStarted","Data":"556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1"} Apr 16 22:51:17.955682 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:17.955685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" 
event={"ID":"6c32839f-549d-453e-9689-3f219c044979","Type":"ContainerStarted","Data":"a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7"} Apr 16 22:51:17.956113 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:17.955887 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:17.956113 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:17.955918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:17.976315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:17.976272 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podStartSLOduration=6.976258847 podStartE2EDuration="6.976258847s" podCreationTimestamp="2026-04-16 22:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:51:17.974272433 +0000 UTC m=+2257.098896720" watchObservedRunningTime="2026-04-16 22:51:17.976258847 +0000 UTC m=+2257.100883134" Apr 16 22:51:23.963896 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:23.963861 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:51:53.965244 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:51:53.965206 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.50:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.50:8080: connect: connection refused" Apr 16 22:52:03.964874 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:52:03.964831 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.50:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.50:8080: connect: connection refused" Apr 16 22:52:13.965383 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:13.965344 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.50:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.50:8080: connect: connection refused" Apr 16 22:52:23.964881 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:23.964835 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.50:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.50:8080: connect: connection refused" Apr 16 22:52:24.449273 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:24.449230 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.50:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.50:8080: connect: connection refused" Apr 16 22:52:34.453126 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:34.453042 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:52:41.651015 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.650973 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j"] Apr 16 22:52:41.651494 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.651441 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" containerID="cri-o://a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7" gracePeriod=30 Apr 16 22:52:41.651621 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.651507 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kube-rbac-proxy" containerID="cri-o://556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1" gracePeriod=30 Apr 16 22:52:41.750119 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750082 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb"] Apr 16 22:52:41.750414 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750401 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kube-rbac-proxy" Apr 16 22:52:41.750468 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750415 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kube-rbac-proxy" Apr 16 22:52:41.750468 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750425 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" 
containerName="storage-initializer" Apr 16 22:52:41.750468 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750431 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="storage-initializer" Apr 16 22:52:41.750468 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750446 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" Apr 16 22:52:41.750468 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750452 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" Apr 16 22:52:41.750627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750497 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kube-rbac-proxy" Apr 16 22:52:41.750627 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.750508 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa92d46d-7898-4aa4-ad50-b632794686b8" containerName="kserve-container" Apr 16 22:52:41.753760 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.753742 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.756091 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.756060 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 16 22:52:41.756091 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.756069 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 16 22:52:41.764754 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.764731 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb"] Apr 16 22:52:41.868150 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.868120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad188539-3912-4b8c-b34c-8734fdcf6bbf-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.868359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.868160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad188539-3912-4b8c-b34c-8734fdcf6bbf-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.868359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.868235 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgvt8\" (UniqueName: \"kubernetes.io/projected/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kube-api-access-sgvt8\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.868359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.868342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.969421 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.969384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.969623 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.969438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad188539-3912-4b8c-b34c-8734fdcf6bbf-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.969623 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.969459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad188539-3912-4b8c-b34c-8734fdcf6bbf-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.969623 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.969503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgvt8\" (UniqueName: \"kubernetes.io/projected/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kube-api-access-sgvt8\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.975567 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.969797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.975567 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.970689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad188539-3912-4b8c-b34c-8734fdcf6bbf-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.975567 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.972909 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad188539-3912-4b8c-b34c-8734fdcf6bbf-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:41.977746 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:41.977716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgvt8\" (UniqueName: \"kubernetes.io/projected/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kube-api-access-sgvt8\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:42.065474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:42.065433 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:42.188954 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:42.188911 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb"] Apr 16 22:52:42.191663 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:52:42.191624 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad188539_3912_4b8c_b34c_8734fdcf6bbf.slice/crio-a2d505460ab1479befe198d21310f0a980448713f96d213a4b08eadb4c1a9a73 WatchSource:0}: Error finding container a2d505460ab1479befe198d21310f0a980448713f96d213a4b08eadb4c1a9a73: Status 404 returned error can't find the container with id a2d505460ab1479befe198d21310f0a980448713f96d213a4b08eadb4c1a9a73 Apr 16 22:52:42.193442 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:42.193427 2576 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 16 22:52:42.222349 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:42.222266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" event={"ID":"ad188539-3912-4b8c-b34c-8734fdcf6bbf","Type":"ContainerStarted","Data":"a2d505460ab1479befe198d21310f0a980448713f96d213a4b08eadb4c1a9a73"} Apr 16 22:52:42.223935 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:42.223912 2576 generic.go:358] "Generic (PLEG): container finished" podID="6c32839f-549d-453e-9689-3f219c044979" containerID="556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1" exitCode=2 Apr 16 22:52:42.224026 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:42.223945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" event={"ID":"6c32839f-549d-453e-9689-3f219c044979","Type":"ContainerDied","Data":"556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1"} Apr 16 22:52:43.229353 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:43.229296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" event={"ID":"ad188539-3912-4b8c-b34c-8734fdcf6bbf","Type":"ContainerStarted","Data":"13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af"} Apr 16 22:52:43.959707 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:43.959663 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 16 22:52:44.449482 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:44.449441 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.50:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.50:8080: connect: connection refused" Apr 16 22:52:46.240646 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.240606 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerID="13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af" exitCode=0 Apr 16 22:52:46.241048 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.240677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" event={"ID":"ad188539-3912-4b8c-b34c-8734fdcf6bbf","Type":"ContainerDied","Data":"13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af"} Apr 16 22:52:46.491380 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.491357 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:52:46.609010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.608909 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c32839f-549d-453e-9689-3f219c044979-kserve-provision-location\") pod \"6c32839f-549d-453e-9689-3f219c044979\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " Apr 16 22:52:46.609010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.608962 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c32839f-549d-453e-9689-3f219c044979-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"6c32839f-549d-453e-9689-3f219c044979\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " Apr 16 22:52:46.609010 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.609022 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls\") pod \"6c32839f-549d-453e-9689-3f219c044979\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " Apr 16 22:52:46.609315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.609046 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvt7w\" (UniqueName: \"kubernetes.io/projected/6c32839f-549d-453e-9689-3f219c044979-kube-api-access-zvt7w\") pod \"6c32839f-549d-453e-9689-3f219c044979\" (UID: \"6c32839f-549d-453e-9689-3f219c044979\") " Apr 16 22:52:46.609315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.609212 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c32839f-549d-453e-9689-3f219c044979-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"6c32839f-549d-453e-9689-3f219c044979" (UID: "6c32839f-549d-453e-9689-3f219c044979"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:52:46.609315 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.609301 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c32839f-549d-453e-9689-3f219c044979-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "6c32839f-549d-453e-9689-3f219c044979" (UID: "6c32839f-549d-453e-9689-3f219c044979"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:52:46.611146 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.611124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c32839f-549d-453e-9689-3f219c044979-kube-api-access-zvt7w" (OuterVolumeSpecName: "kube-api-access-zvt7w") pod "6c32839f-549d-453e-9689-3f219c044979" (UID: "6c32839f-549d-453e-9689-3f219c044979"). InnerVolumeSpecName "kube-api-access-zvt7w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:52:46.611222 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.611164 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6c32839f-549d-453e-9689-3f219c044979" (UID: "6c32839f-549d-453e-9689-3f219c044979"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:52:46.709851 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.709799 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c32839f-549d-453e-9689-3f219c044979-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:52:46.709851 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.709842 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvt7w\" (UniqueName: \"kubernetes.io/projected/6c32839f-549d-453e-9689-3f219c044979-kube-api-access-zvt7w\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:52:46.709851 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.709852 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c32839f-549d-453e-9689-3f219c044979-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:52:46.709851 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:46.709862 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c32839f-549d-453e-9689-3f219c044979-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:52:47.244870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.244833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" event={"ID":"ad188539-3912-4b8c-b34c-8734fdcf6bbf","Type":"ContainerStarted","Data":"8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46"} Apr 16 22:52:47.244870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.244874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" 
event={"ID":"ad188539-3912-4b8c-b34c-8734fdcf6bbf","Type":"ContainerStarted","Data":"e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73"} Apr 16 22:52:47.245445 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.245096 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:47.246499 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.246475 2576 generic.go:358] "Generic (PLEG): container finished" podID="6c32839f-549d-453e-9689-3f219c044979" containerID="a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7" exitCode=0 Apr 16 22:52:47.246639 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.246538 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" Apr 16 22:52:47.246639 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.246542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" event={"ID":"6c32839f-549d-453e-9689-3f219c044979","Type":"ContainerDied","Data":"a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7"} Apr 16 22:52:47.246639 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.246565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j" event={"ID":"6c32839f-549d-453e-9689-3f219c044979","Type":"ContainerDied","Data":"e35ba2007c346c68b5efa9bfb2f2902b8295d9dc6ee41313bb78b9c3dc2e31f4"} Apr 16 22:52:47.246639 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.246580 2576 scope.go:117] "RemoveContainer" containerID="556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1" Apr 16 22:52:47.254818 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.254746 2576 scope.go:117] "RemoveContainer" 
containerID="a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7" Apr 16 22:52:47.261761 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.261743 2576 scope.go:117] "RemoveContainer" containerID="ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4" Apr 16 22:52:47.264554 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.264505 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podStartSLOduration=6.264489294 podStartE2EDuration="6.264489294s" podCreationTimestamp="2026-04-16 22:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:52:47.263662001 +0000 UTC m=+2346.388286287" watchObservedRunningTime="2026-04-16 22:52:47.264489294 +0000 UTC m=+2346.389113582" Apr 16 22:52:47.272104 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.272085 2576 scope.go:117] "RemoveContainer" containerID="556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1" Apr 16 22:52:47.272748 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:52:47.272723 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1\": container with ID starting with 556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1 not found: ID does not exist" containerID="556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1" Apr 16 22:52:47.272849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.272758 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1"} err="failed to get container status \"556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1\": rpc error: code = NotFound desc = could not find container 
\"556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1\": container with ID starting with 556a72b93f88213b23c684124f90edfa0ab6fdb57806971c057475b49bc5f5f1 not found: ID does not exist" Apr 16 22:52:47.272849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.272783 2576 scope.go:117] "RemoveContainer" containerID="a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7" Apr 16 22:52:47.273043 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:52:47.273021 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7\": container with ID starting with a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7 not found: ID does not exist" containerID="a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7" Apr 16 22:52:47.273093 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.273051 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7"} err="failed to get container status \"a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7\": rpc error: code = NotFound desc = could not find container \"a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7\": container with ID starting with a5deed15085e6bc11acf29677dc57a13f7773166eb34280c13e8aba16e75e8b7 not found: ID does not exist" Apr 16 22:52:47.273093 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.273072 2576 scope.go:117] "RemoveContainer" containerID="ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4" Apr 16 22:52:47.273340 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:52:47.273309 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4\": container with ID starting with 
ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4 not found: ID does not exist" containerID="ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4" Apr 16 22:52:47.273408 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.273345 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4"} err="failed to get container status \"ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4\": rpc error: code = NotFound desc = could not find container \"ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4\": container with ID starting with ad9a2e1f4db02b4a5dcc0e78a31d0280fc25ab89efbe265725c80d0cb4e002e4 not found: ID does not exist" Apr 16 22:52:47.281910 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.281888 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j"] Apr 16 22:52:47.285823 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.285803 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-8ns7j"] Apr 16 22:52:47.452800 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:47.452770 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c32839f-549d-453e-9689-3f219c044979" path="/var/lib/kubelet/pods/6c32839f-549d-453e-9689-3f219c044979/volumes" Apr 16 22:52:48.251547 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:48.251516 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:52:54.259632 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:52:54.259602 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:53:24.260552 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:53:24.260511 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 16 22:53:34.260115 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:53:34.260074 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 16 22:53:44.260989 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:53:44.260946 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 16 22:53:54.260141 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:53:54.260104 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 16 22:54:04.263800 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:04.263714 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:54:11.918700 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.918666 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb"] Apr 16 22:54:11.919182 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.919098 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" containerID="cri-o://e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73" gracePeriod=30 Apr 16 22:54:11.919279 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.919166 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kube-rbac-proxy" containerID="cri-o://8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46" gracePeriod=30 Apr 16 22:54:11.964872 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.964846 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp"] Apr 16 22:54:11.965160 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.965143 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="storage-initializer" Apr 16 22:54:11.965160 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.965159 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="storage-initializer" Apr 16 22:54:11.965314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.965174 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kube-rbac-proxy" Apr 16 22:54:11.965314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.965180 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kube-rbac-proxy" Apr 16 22:54:11.965314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.965190 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" Apr 16 22:54:11.965314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.965196 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" Apr 16 22:54:11.965314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.965247 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kube-rbac-proxy" Apr 16 22:54:11.965314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.965256 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c32839f-549d-453e-9689-3f219c044979" containerName="kserve-container" Apr 16 22:54:11.968435 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.968418 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:11.970585 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.970567 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 16 22:54:11.970718 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.970697 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 16 22:54:11.979008 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:11.978986 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp"] Apr 16 22:54:12.062418 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.062383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd49816e-f69f-4a30-be7e-166a709fb35c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.062568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.062442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfhd\" (UniqueName: \"kubernetes.io/projected/fd49816e-f69f-4a30-be7e-166a709fb35c-kube-api-access-4kfhd\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.062568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.062486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.062568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.062507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd49816e-f69f-4a30-be7e-166a709fb35c-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.163871 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.163829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfhd\" (UniqueName: \"kubernetes.io/projected/fd49816e-f69f-4a30-be7e-166a709fb35c-kube-api-access-4kfhd\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.164044 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.163882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.164044 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.163915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd49816e-f69f-4a30-be7e-166a709fb35c-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.164044 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.163980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd49816e-f69f-4a30-be7e-166a709fb35c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.164289 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:54:12.164049 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-serving-cert: secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 16 22:54:12.164289 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:54:12.164135 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls podName:fd49816e-f69f-4a30-be7e-166a709fb35c nodeName:}" failed. No retries permitted until 2026-04-16 22:54:12.664116472 +0000 UTC m=+2431.788740741 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls") pod "isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" (UID: "fd49816e-f69f-4a30-be7e-166a709fb35c") : secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 16 22:54:12.164480 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.164423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd49816e-f69f-4a30-be7e-166a709fb35c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.164709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.164689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd49816e-f69f-4a30-be7e-166a709fb35c-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.174082 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.174024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfhd\" (UniqueName: \"kubernetes.io/projected/fd49816e-f69f-4a30-be7e-166a709fb35c-kube-api-access-4kfhd\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.516087 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.516057 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerID="8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46" exitCode=2 Apr 16 22:54:12.518027 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.516134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" event={"ID":"ad188539-3912-4b8c-b34c-8734fdcf6bbf","Type":"ContainerDied","Data":"8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46"} Apr 16 22:54:12.667670 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.667632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.670047 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.670028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:12.878555 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:12.878470 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:13.001161 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:13.001129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp"] Apr 16 22:54:13.004724 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:54:13.004700 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd49816e_f69f_4a30_be7e_166a709fb35c.slice/crio-ea3e65571b1d14bf8ebf96e528447c81dbaedb9a0badff2b9144089f7ab02aa3 WatchSource:0}: Error finding container ea3e65571b1d14bf8ebf96e528447c81dbaedb9a0badff2b9144089f7ab02aa3: Status 404 returned error can't find the container with id ea3e65571b1d14bf8ebf96e528447c81dbaedb9a0badff2b9144089f7ab02aa3 Apr 16 22:54:13.520500 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:13.520464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" event={"ID":"fd49816e-f69f-4a30-be7e-166a709fb35c","Type":"ContainerStarted","Data":"3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603"} Apr 16 22:54:13.520500 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:13.520503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" event={"ID":"fd49816e-f69f-4a30-be7e-166a709fb35c","Type":"ContainerStarted","Data":"ea3e65571b1d14bf8ebf96e528447c81dbaedb9a0badff2b9144089f7ab02aa3"} Apr 16 22:54:14.254253 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:14.254209 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.51:8643/healthz\": dial tcp 
10.133.0.51:8643: connect: connection refused" Apr 16 22:54:14.260343 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:14.260298 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 16 22:54:16.665935 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.665911 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:54:16.800769 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.800698 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgvt8\" (UniqueName: \"kubernetes.io/projected/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kube-api-access-sgvt8\") pod \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " Apr 16 22:54:16.800769 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.800756 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad188539-3912-4b8c-b34c-8734fdcf6bbf-proxy-tls\") pod \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " Apr 16 22:54:16.800969 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.800783 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad188539-3912-4b8c-b34c-8734fdcf6bbf-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " Apr 16 22:54:16.800969 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:54:16.800815 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kserve-provision-location\") pod \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\" (UID: \"ad188539-3912-4b8c-b34c-8734fdcf6bbf\") " Apr 16 22:54:16.801191 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.801153 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad188539-3912-4b8c-b34c-8734fdcf6bbf" (UID: "ad188539-3912-4b8c-b34c-8734fdcf6bbf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:54:16.801191 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.801178 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad188539-3912-4b8c-b34c-8734fdcf6bbf-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "ad188539-3912-4b8c-b34c-8734fdcf6bbf" (UID: "ad188539-3912-4b8c-b34c-8734fdcf6bbf"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:54:16.802823 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.802784 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kube-api-access-sgvt8" (OuterVolumeSpecName: "kube-api-access-sgvt8") pod "ad188539-3912-4b8c-b34c-8734fdcf6bbf" (UID: "ad188539-3912-4b8c-b34c-8734fdcf6bbf"). InnerVolumeSpecName "kube-api-access-sgvt8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:54:16.802980 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.802830 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad188539-3912-4b8c-b34c-8734fdcf6bbf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ad188539-3912-4b8c-b34c-8734fdcf6bbf" (UID: "ad188539-3912-4b8c-b34c-8734fdcf6bbf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:54:16.901981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.901949 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:54:16.901981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.901978 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sgvt8\" (UniqueName: \"kubernetes.io/projected/ad188539-3912-4b8c-b34c-8734fdcf6bbf-kube-api-access-sgvt8\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:54:16.901981 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.901989 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad188539-3912-4b8c-b34c-8734fdcf6bbf-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:54:16.902211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:16.901999 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad188539-3912-4b8c-b34c-8734fdcf6bbf-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:54:17.534834 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.534802 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerID="e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73" exitCode=0 Apr 16 22:54:17.535036 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.534878 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" Apr 16 22:54:17.535036 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.534883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" event={"ID":"ad188539-3912-4b8c-b34c-8734fdcf6bbf","Type":"ContainerDied","Data":"e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73"} Apr 16 22:54:17.535036 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.534920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb" event={"ID":"ad188539-3912-4b8c-b34c-8734fdcf6bbf","Type":"ContainerDied","Data":"a2d505460ab1479befe198d21310f0a980448713f96d213a4b08eadb4c1a9a73"} Apr 16 22:54:17.535036 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.534936 2576 scope.go:117] "RemoveContainer" containerID="8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46" Apr 16 22:54:17.536870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.536842 2576 generic.go:358] "Generic (PLEG): container finished" podID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerID="3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603" exitCode=0 Apr 16 22:54:17.537000 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.536894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" event={"ID":"fd49816e-f69f-4a30-be7e-166a709fb35c","Type":"ContainerDied","Data":"3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603"} Apr 16 22:54:17.542855 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:54:17.542824 2576 scope.go:117] "RemoveContainer" containerID="e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73" Apr 16 22:54:17.549962 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.549946 2576 scope.go:117] "RemoveContainer" containerID="13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.564192 2576 scope.go:117] "RemoveContainer" containerID="8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:54:17.564648 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46\": container with ID starting with 8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46 not found: ID does not exist" containerID="8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.564678 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46"} err="failed to get container status \"8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46\": rpc error: code = NotFound desc = could not find container \"8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46\": container with ID starting with 8cb61e3b11ca7b8fbebd1a3020aea2d8110bc3596304baea57ed12991adccf46 not found: ID does not exist" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.564699 2576 scope.go:117] "RemoveContainer" containerID="e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:54:17.564958 2576 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73\": container with ID starting with e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73 not found: ID does not exist" containerID="e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.564982 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73"} err="failed to get container status \"e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73\": rpc error: code = NotFound desc = could not find container \"e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73\": container with ID starting with e756cc8b9a4b64efeb39646e298085f09265adc92330cdd47e7c72bac6847c73 not found: ID does not exist" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.564999 2576 scope.go:117] "RemoveContainer" containerID="13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:54:17.565248 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af\": container with ID starting with 13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af not found: ID does not exist" containerID="13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af" Apr 16 22:54:17.566354 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.565270 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af"} err="failed to get container status \"13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af\": rpc 
error: code = NotFound desc = could not find container \"13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af\": container with ID starting with 13cee016017a0ea274173b53f9ee54a33c43eb3c2614f0bede7d2447855926af not found: ID does not exist" Apr 16 22:54:17.569759 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.569733 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb"] Apr 16 22:54:17.572382 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:17.572361 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5sqbb"] Apr 16 22:54:18.543460 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:18.543423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" event={"ID":"fd49816e-f69f-4a30-be7e-166a709fb35c","Type":"ContainerStarted","Data":"bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827"} Apr 16 22:54:18.543460 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:18.543460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" event={"ID":"fd49816e-f69f-4a30-be7e-166a709fb35c","Type":"ContainerStarted","Data":"8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e"} Apr 16 22:54:18.543919 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:18.543683 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:18.543919 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:18.543751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:18.560826 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:18.560780 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podStartSLOduration=7.560767659 podStartE2EDuration="7.560767659s" podCreationTimestamp="2026-04-16 22:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:54:18.559633468 +0000 UTC m=+2437.684257757" watchObservedRunningTime="2026-04-16 22:54:18.560767659 +0000 UTC m=+2437.685391946" Apr 16 22:54:19.459253 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:19.459204 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" path="/var/lib/kubelet/pods/ad188539-3912-4b8c-b34c-8734fdcf6bbf/volumes" Apr 16 22:54:24.552196 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:24.552165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:54:54.553134 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:54:54.553095 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.52:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 22:55:04.552650 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:04.552608 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.52:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 22:55:14.552734 
ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:14.552646 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.52:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 22:55:24.553344 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:24.553296 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.52:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 22:55:34.555752 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:34.555670 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:55:42.039078 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:42.039046 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp"] Apr 16 22:55:42.039576 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:42.039367 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" containerID="cri-o://8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e" gracePeriod=30 Apr 16 22:55:42.039576 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:42.039421 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kube-rbac-proxy" containerID="cri-o://bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827" gracePeriod=30 Apr 16 22:55:42.812843 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:42.812807 2576 generic.go:358] "Generic (PLEG): container finished" podID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerID="bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827" exitCode=2 Apr 16 22:55:42.813015 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:42.812883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" event={"ID":"fd49816e-f69f-4a30-be7e-166a709fb35c","Type":"ContainerDied","Data":"bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827"} Apr 16 22:55:44.211899 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.211861 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm"] Apr 16 22:55:44.212263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.212203 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" Apr 16 22:55:44.212263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.212215 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" Apr 16 22:55:44.212263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.212226 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="storage-initializer" Apr 16 22:55:44.212263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.212232 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" 
containerName="storage-initializer" Apr 16 22:55:44.212263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.212240 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kube-rbac-proxy" Apr 16 22:55:44.212263 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.212245 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kube-rbac-proxy" Apr 16 22:55:44.212486 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.212302 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kube-rbac-proxy" Apr 16 22:55:44.212486 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.212311 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad188539-3912-4b8c-b34c-8734fdcf6bbf" containerName="kserve-container" Apr 16 22:55:44.215425 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.215405 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.217804 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.217783 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 16 22:55:44.217911 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.217783 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 16 22:55:44.225903 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.225882 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm"] Apr 16 22:55:44.298109 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.298071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.298109 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.298112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd56n\" (UniqueName: \"kubernetes.io/projected/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kube-api-access-dd56n\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.298363 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.298141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.298363 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.298196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.398690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.398659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.398849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.398711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.398849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.398743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd56n\" (UniqueName: \"kubernetes.io/projected/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kube-api-access-dd56n\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.398849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.398787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.399205 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.399187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.399448 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.399430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.401250 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.401231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.407075 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.407052 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dd56n\" (UniqueName: \"kubernetes.io/projected/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kube-api-access-dd56n\") pod \"isvc-sklearn-predictor-d8dbfbbb9-j5mxm\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.526014 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.525926 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:44.548073 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.548034 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.52:8643/healthz\": dial tcp 10.133.0.52:8643: connect: connection refused" Apr 16 22:55:44.553241 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.553209 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.52:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 22:55:44.646241 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.646208 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm"] Apr 16 22:55:44.651873 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:55:44.651803 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddfa3d90_c606_4e26_8009_8cb2d1e1586f.slice/crio-5d5bc4bc45eb66076b71ce84f9d747e860520b1e3f27688dc3921895d0eccd07 WatchSource:0}: Error finding 
container 5d5bc4bc45eb66076b71ce84f9d747e860520b1e3f27688dc3921895d0eccd07: Status 404 returned error can't find the container with id 5d5bc4bc45eb66076b71ce84f9d747e860520b1e3f27688dc3921895d0eccd07 Apr 16 22:55:44.821206 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.821126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" event={"ID":"ddfa3d90-c606-4e26-8009-8cb2d1e1586f","Type":"ContainerStarted","Data":"9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001"} Apr 16 22:55:44.821206 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:44.821158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" event={"ID":"ddfa3d90-c606-4e26-8009-8cb2d1e1586f","Type":"ContainerStarted","Data":"5d5bc4bc45eb66076b71ce84f9d747e860520b1e3f27688dc3921895d0eccd07"} Apr 16 22:55:47.293957 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.293935 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:55:47.315731 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.315700 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kfhd\" (UniqueName: \"kubernetes.io/projected/fd49816e-f69f-4a30-be7e-166a709fb35c-kube-api-access-4kfhd\") pod \"fd49816e-f69f-4a30-be7e-166a709fb35c\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " Apr 16 22:55:47.315849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.315751 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls\") pod \"fd49816e-f69f-4a30-be7e-166a709fb35c\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " Apr 16 22:55:47.315896 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.315879 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd49816e-f69f-4a30-be7e-166a709fb35c-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"fd49816e-f69f-4a30-be7e-166a709fb35c\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " Apr 16 22:55:47.315964 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.315940 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd49816e-f69f-4a30-be7e-166a709fb35c-kserve-provision-location\") pod \"fd49816e-f69f-4a30-be7e-166a709fb35c\" (UID: \"fd49816e-f69f-4a30-be7e-166a709fb35c\") " Apr 16 22:55:47.316244 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.316220 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd49816e-f69f-4a30-be7e-166a709fb35c-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "fd49816e-f69f-4a30-be7e-166a709fb35c" (UID: "fd49816e-f69f-4a30-be7e-166a709fb35c"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:55:47.316373 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.316340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd49816e-f69f-4a30-be7e-166a709fb35c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fd49816e-f69f-4a30-be7e-166a709fb35c" (UID: "fd49816e-f69f-4a30-be7e-166a709fb35c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:55:47.317833 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.317805 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fd49816e-f69f-4a30-be7e-166a709fb35c" (UID: "fd49816e-f69f-4a30-be7e-166a709fb35c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:55:47.317927 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.317892 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd49816e-f69f-4a30-be7e-166a709fb35c-kube-api-access-4kfhd" (OuterVolumeSpecName: "kube-api-access-4kfhd") pod "fd49816e-f69f-4a30-be7e-166a709fb35c" (UID: "fd49816e-f69f-4a30-be7e-166a709fb35c"). InnerVolumeSpecName "kube-api-access-4kfhd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:55:47.416579 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.416511 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kfhd\" (UniqueName: \"kubernetes.io/projected/fd49816e-f69f-4a30-be7e-166a709fb35c-kube-api-access-4kfhd\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:55:47.416579 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.416537 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd49816e-f69f-4a30-be7e-166a709fb35c-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:55:47.416579 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.416548 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd49816e-f69f-4a30-be7e-166a709fb35c-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:55:47.416579 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.416560 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd49816e-f69f-4a30-be7e-166a709fb35c-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:55:47.833087 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.833051 2576 generic.go:358] "Generic (PLEG): container finished" podID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerID="8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e" exitCode=0 Apr 16 22:55:47.833266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.833125 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" Apr 16 22:55:47.833266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.833126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" event={"ID":"fd49816e-f69f-4a30-be7e-166a709fb35c","Type":"ContainerDied","Data":"8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e"} Apr 16 22:55:47.833266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.833165 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp" event={"ID":"fd49816e-f69f-4a30-be7e-166a709fb35c","Type":"ContainerDied","Data":"ea3e65571b1d14bf8ebf96e528447c81dbaedb9a0badff2b9144089f7ab02aa3"} Apr 16 22:55:47.833266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.833184 2576 scope.go:117] "RemoveContainer" containerID="bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827" Apr 16 22:55:47.840695 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.840674 2576 scope.go:117] "RemoveContainer" containerID="8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e" Apr 16 22:55:47.847471 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.847438 2576 scope.go:117] "RemoveContainer" containerID="3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603" Apr 16 22:55:47.850198 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.850172 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp"] Apr 16 22:55:47.853230 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.853207 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-t58bp"] Apr 16 22:55:47.854689 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.854670 2576 scope.go:117] "RemoveContainer" 
containerID="bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827" Apr 16 22:55:47.854930 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:55:47.854911 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827\": container with ID starting with bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827 not found: ID does not exist" containerID="bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827" Apr 16 22:55:47.854980 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.854945 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827"} err="failed to get container status \"bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827\": rpc error: code = NotFound desc = could not find container \"bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827\": container with ID starting with bb1ae8d6ab19e4b98395aa936296efe158498f3d599452db25e44e016c771827 not found: ID does not exist" Apr 16 22:55:47.854980 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.854969 2576 scope.go:117] "RemoveContainer" containerID="8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e" Apr 16 22:55:47.855178 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:55:47.855162 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e\": container with ID starting with 8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e not found: ID does not exist" containerID="8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e" Apr 16 22:55:47.855215 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.855184 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e"} err="failed to get container status \"8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e\": rpc error: code = NotFound desc = could not find container \"8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e\": container with ID starting with 8f43150070eb47ddc94dc678ee0a9b82942d94e7ad475c24fac3d9ef3333a12e not found: ID does not exist" Apr 16 22:55:47.855215 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.855199 2576 scope.go:117] "RemoveContainer" containerID="3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603" Apr 16 22:55:47.855520 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:55:47.855498 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603\": container with ID starting with 3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603 not found: ID does not exist" containerID="3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603" Apr 16 22:55:47.855604 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:47.855522 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603"} err="failed to get container status \"3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603\": rpc error: code = NotFound desc = could not find container \"3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603\": container with ID starting with 3352a5807179345151ccd5ba58230263c2e891a3e0a3af72d134155271ed2603 not found: ID does not exist" Apr 16 22:55:48.838462 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:48.838377 2576 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" 
containerID="9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001" exitCode=0 Apr 16 22:55:48.838462 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:48.838437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" event={"ID":"ddfa3d90-c606-4e26-8009-8cb2d1e1586f","Type":"ContainerDied","Data":"9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001"} Apr 16 22:55:49.453602 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:49.453571 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" path="/var/lib/kubelet/pods/fd49816e-f69f-4a30-be7e-166a709fb35c/volumes" Apr 16 22:55:49.844087 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:49.844003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" event={"ID":"ddfa3d90-c606-4e26-8009-8cb2d1e1586f","Type":"ContainerStarted","Data":"6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0"} Apr 16 22:55:49.844087 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:49.844043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" event={"ID":"ddfa3d90-c606-4e26-8009-8cb2d1e1586f","Type":"ContainerStarted","Data":"c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be"} Apr 16 22:55:49.844545 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:49.844353 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:49.844545 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:49.844511 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:49.845783 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:49.845756 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:55:49.867865 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:49.867819 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podStartSLOduration=5.867804026 podStartE2EDuration="5.867804026s" podCreationTimestamp="2026-04-16 22:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:55:49.867060354 +0000 UTC m=+2528.991684641" watchObservedRunningTime="2026-04-16 22:55:49.867804026 +0000 UTC m=+2528.992428315" Apr 16 22:55:50.847662 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:50.847614 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:55:55.851645 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:55.851619 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:55:55.852155 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:55:55.852130 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:56:05.852778 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:56:05.852739 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:56:15.852051 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:56:15.852010 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:56:25.852539 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:56:25.852498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:56:35.852900 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:56:35.852860 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:56:45.852883 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:56:45.852846 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:56:55.853491 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:56:55.853461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:57:04.300654 ip-10-0-133-183 kubenswrapper[2576]: I0416 
22:57:04.300576 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm"] Apr 16 22:57:04.301045 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.300893 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" containerID="cri-o://c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be" gracePeriod=30 Apr 16 22:57:04.301045 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.300941 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kube-rbac-proxy" containerID="cri-o://6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0" gracePeriod=30 Apr 16 22:57:04.392937 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.392900 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh"] Apr 16 22:57:04.393411 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.393395 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kube-rbac-proxy" Apr 16 22:57:04.393469 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.393415 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kube-rbac-proxy" Apr 16 22:57:04.393469 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.393445 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="storage-initializer" Apr 16 22:57:04.393469 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.393455 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="storage-initializer" Apr 16 22:57:04.393469 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.393467 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" Apr 16 22:57:04.393600 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.393476 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" Apr 16 22:57:04.393600 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.393549 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kube-rbac-proxy" Apr 16 22:57:04.393600 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.393563 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd49816e-f69f-4a30-be7e-166a709fb35c" containerName="kserve-container" Apr 16 22:57:04.397115 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.397093 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.399579 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.399557 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 16 22:57:04.399720 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.399598 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 16 22:57:04.407841 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.407815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh"] Apr 16 22:57:04.492002 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.491964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.492197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.492008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed14173-a9fa-4a1a-9883-7ca59c34722b-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.492197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.492082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwpfr\" (UniqueName: 
\"kubernetes.io/projected/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kube-api-access-wwpfr\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.492197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.492134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ed14173-a9fa-4a1a-9883-7ca59c34722b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.593200 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.593117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ed14173-a9fa-4a1a-9883-7ca59c34722b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.593200 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.593174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.593421 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.593205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9ed14173-a9fa-4a1a-9883-7ca59c34722b-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.593421 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.593252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwpfr\" (UniqueName: \"kubernetes.io/projected/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kube-api-access-wwpfr\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.593690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.593658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.593889 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.593868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ed14173-a9fa-4a1a-9883-7ca59c34722b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.595755 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.595735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed14173-a9fa-4a1a-9883-7ca59c34722b-proxy-tls\") pod 
\"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.600554 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.600535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwpfr\" (UniqueName: \"kubernetes.io/projected/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kube-api-access-wwpfr\") pod \"sklearn-v2-mlserver-predictor-65d8664766-f4fdh\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.708740 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.708708 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:04.830314 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:04.830289 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh"] Apr 16 22:57:04.832757 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:57:04.832731 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed14173_a9fa_4a1a_9883_7ca59c34722b.slice/crio-51f80145c556c1bcdc770c806b2a59b70f709d818afcf65c633a2417c854363c WatchSource:0}: Error finding container 51f80145c556c1bcdc770c806b2a59b70f709d818afcf65c633a2417c854363c: Status 404 returned error can't find the container with id 51f80145c556c1bcdc770c806b2a59b70f709d818afcf65c633a2417c854363c Apr 16 22:57:05.084276 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:05.084245 2576 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerID="6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0" exitCode=2 Apr 16 22:57:05.084467 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:05.084317 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" event={"ID":"ddfa3d90-c606-4e26-8009-8cb2d1e1586f","Type":"ContainerDied","Data":"6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0"} Apr 16 22:57:05.085628 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:05.085600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" event={"ID":"9ed14173-a9fa-4a1a-9883-7ca59c34722b","Type":"ContainerStarted","Data":"17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8"} Apr 16 22:57:05.085726 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:05.085636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" event={"ID":"9ed14173-a9fa-4a1a-9883-7ca59c34722b","Type":"ContainerStarted","Data":"51f80145c556c1bcdc770c806b2a59b70f709d818afcf65c633a2417c854363c"} Apr 16 22:57:05.848387 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:05.848316 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused" Apr 16 22:57:05.852624 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:05.852596 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 22:57:08.646382 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.646354 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:57:08.825663 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.825621 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd56n\" (UniqueName: \"kubernetes.io/projected/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kube-api-access-dd56n\") pod \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " Apr 16 22:57:08.825663 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.825673 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " Apr 16 22:57:08.825891 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.825700 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kserve-provision-location\") pod \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " Apr 16 22:57:08.825891 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.825755 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-proxy-tls\") pod \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\" (UID: \"ddfa3d90-c606-4e26-8009-8cb2d1e1586f\") " Apr 16 22:57:08.826062 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.826038 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ddfa3d90-c606-4e26-8009-8cb2d1e1586f" 
(UID: "ddfa3d90-c606-4e26-8009-8cb2d1e1586f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:57:08.826114 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.826039 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "ddfa3d90-c606-4e26-8009-8cb2d1e1586f" (UID: "ddfa3d90-c606-4e26-8009-8cb2d1e1586f"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:57:08.827921 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.827898 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ddfa3d90-c606-4e26-8009-8cb2d1e1586f" (UID: "ddfa3d90-c606-4e26-8009-8cb2d1e1586f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:57:08.827977 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.827966 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kube-api-access-dd56n" (OuterVolumeSpecName: "kube-api-access-dd56n") pod "ddfa3d90-c606-4e26-8009-8cb2d1e1586f" (UID: "ddfa3d90-c606-4e26-8009-8cb2d1e1586f"). InnerVolumeSpecName "kube-api-access-dd56n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:57:08.927266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.927222 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:57:08.927266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.927263 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:57:08.927266 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.927277 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:57:08.927519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:08.927287 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dd56n\" (UniqueName: \"kubernetes.io/projected/ddfa3d90-c606-4e26-8009-8cb2d1e1586f-kube-api-access-dd56n\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:57:09.099251 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.099166 2576 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerID="c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be" exitCode=0 Apr 16 22:57:09.099251 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.099238 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" Apr 16 22:57:09.099486 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.099233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" event={"ID":"ddfa3d90-c606-4e26-8009-8cb2d1e1586f","Type":"ContainerDied","Data":"c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be"} Apr 16 22:57:09.099486 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.099365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm" event={"ID":"ddfa3d90-c606-4e26-8009-8cb2d1e1586f","Type":"ContainerDied","Data":"5d5bc4bc45eb66076b71ce84f9d747e860520b1e3f27688dc3921895d0eccd07"} Apr 16 22:57:09.099486 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.099397 2576 scope.go:117] "RemoveContainer" containerID="6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0" Apr 16 22:57:09.100687 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.100667 2576 generic.go:358] "Generic (PLEG): container finished" podID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerID="17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8" exitCode=0 Apr 16 22:57:09.100801 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.100738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" event={"ID":"9ed14173-a9fa-4a1a-9883-7ca59c34722b","Type":"ContainerDied","Data":"17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8"} Apr 16 22:57:09.108146 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.108130 2576 scope.go:117] "RemoveContainer" containerID="c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be" Apr 16 22:57:09.115274 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.115255 2576 scope.go:117] "RemoveContainer" 
containerID="9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001" Apr 16 22:57:09.124161 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.124129 2576 scope.go:117] "RemoveContainer" containerID="6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0" Apr 16 22:57:09.124489 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:57:09.124466 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0\": container with ID starting with 6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0 not found: ID does not exist" containerID="6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0" Apr 16 22:57:09.124591 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.124502 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0"} err="failed to get container status \"6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0\": rpc error: code = NotFound desc = could not find container \"6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0\": container with ID starting with 6255eff7b0044523a9a2b1816ac185814b19fb03d70f154f38ec19621ffae0d0 not found: ID does not exist" Apr 16 22:57:09.124591 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.124529 2576 scope.go:117] "RemoveContainer" containerID="c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be" Apr 16 22:57:09.124833 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:57:09.124815 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be\": container with ID starting with c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be not found: ID does not exist" 
containerID="c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be" Apr 16 22:57:09.124896 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.124840 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be"} err="failed to get container status \"c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be\": rpc error: code = NotFound desc = could not find container \"c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be\": container with ID starting with c051154b6702a27bbf43791d77a41dba86fc3cefe7072acd8cbda46eeee026be not found: ID does not exist" Apr 16 22:57:09.124896 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.124858 2576 scope.go:117] "RemoveContainer" containerID="9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001" Apr 16 22:57:09.125147 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:57:09.125107 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001\": container with ID starting with 9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001 not found: ID does not exist" containerID="9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001" Apr 16 22:57:09.125260 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.125151 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001"} err="failed to get container status \"9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001\": rpc error: code = NotFound desc = could not find container \"9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001\": container with ID starting with 9beb840f7278d6315ff6b334a3e5540e23af431132cb0a9aa40a433757cc7001 not found: ID does not exist" Apr 16 
22:57:09.133063 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.133039 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm"] Apr 16 22:57:09.138709 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.138686 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-j5mxm"] Apr 16 22:57:09.453159 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:09.453124 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" path="/var/lib/kubelet/pods/ddfa3d90-c606-4e26-8009-8cb2d1e1586f/volumes" Apr 16 22:57:10.107255 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:10.107222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" event={"ID":"9ed14173-a9fa-4a1a-9883-7ca59c34722b","Type":"ContainerStarted","Data":"7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a"} Apr 16 22:57:10.107255 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:10.107259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" event={"ID":"9ed14173-a9fa-4a1a-9883-7ca59c34722b","Type":"ContainerStarted","Data":"9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b"} Apr 16 22:57:10.107723 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:10.107589 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:10.107723 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:10.107632 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:10.125664 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:10.125605 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" podStartSLOduration=6.125588101 podStartE2EDuration="6.125588101s" podCreationTimestamp="2026-04-16 22:57:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:57:10.124145743 +0000 UTC m=+2609.248770030" watchObservedRunningTime="2026-04-16 22:57:10.125588101 +0000 UTC m=+2609.250212391" Apr 16 22:57:16.118316 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:16.118289 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:57:46.137122 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:46.137076 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 22:57:56.121271 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:57:56.121237 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:58:04.494555 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.494520 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh"] Apr 16 22:58:04.495028 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.494996 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kube-rbac-proxy" containerID="cri-o://7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a" gracePeriod=30 Apr 16 22:58:04.495147 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.494948 2576 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kserve-container" containerID="cri-o://9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b" gracePeriod=30 Apr 16 22:58:04.554778 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.554747 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"] Apr 16 22:58:04.555126 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.555108 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" Apr 16 22:58:04.555211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.555129 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" Apr 16 22:58:04.555211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.555142 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="storage-initializer" Apr 16 22:58:04.555211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.555150 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="storage-initializer" Apr 16 22:58:04.555211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.555183 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kube-rbac-proxy" Apr 16 22:58:04.555211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.555192 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kube-rbac-proxy" Apr 16 22:58:04.555512 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.555267 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kube-rbac-proxy" Apr 16 22:58:04.555512 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.555280 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddfa3d90-c606-4e26-8009-8cb2d1e1586f" containerName="kserve-container" Apr 16 22:58:04.558405 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.558383 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.560690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.560669 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 16 22:58:04.560792 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.560688 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 16 22:58:04.567902 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.567870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"] Apr 16 22:58:04.624004 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.623973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c534b9aa-dbfb-479a-be18-34a1b9e41179-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.624145 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.624009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxhp6\" 
(UniqueName: \"kubernetes.io/projected/c534b9aa-dbfb-479a-be18-34a1b9e41179-kube-api-access-nxhp6\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.624145 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.624070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.624145 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.624099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c534b9aa-dbfb-479a-be18-34a1b9e41179-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.724854 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.724806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c534b9aa-dbfb-479a-be18-34a1b9e41179-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.724854 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.724861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c534b9aa-dbfb-479a-be18-34a1b9e41179-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.725082 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.724889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxhp6\" (UniqueName: \"kubernetes.io/projected/c534b9aa-dbfb-479a-be18-34a1b9e41179-kube-api-access-nxhp6\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.725082 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.724938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.725082 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:58:04.725050 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-serving-cert: secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 16 22:58:04.725213 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:58:04.725147 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls podName:c534b9aa-dbfb-479a-be18-34a1b9e41179 nodeName:}" failed. No retries permitted until 2026-04-16 22:58:05.225113015 +0000 UTC m=+2664.349737283 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls") pod "isvc-sklearn-runtime-predictor-65cd49579f-dxsln" (UID: "c534b9aa-dbfb-479a-be18-34a1b9e41179") : secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 16 22:58:04.725255 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.725220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c534b9aa-dbfb-479a-be18-34a1b9e41179-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.725601 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.725581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c534b9aa-dbfb-479a-be18-34a1b9e41179-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:04.734594 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:04.734574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxhp6\" (UniqueName: \"kubernetes.io/projected/c534b9aa-dbfb-479a-be18-34a1b9e41179-kube-api-access-nxhp6\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:05.228091 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:05.228055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:05.230550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:05.230532 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-dxsln\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:05.289069 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:05.289033 2576 generic.go:358] "Generic (PLEG): container finished" podID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerID="7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a" exitCode=2 Apr 16 22:58:05.289238 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:05.289085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" event={"ID":"9ed14173-a9fa-4a1a-9883-7ca59c34722b","Type":"ContainerDied","Data":"7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a"} Apr 16 22:58:05.469344 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:05.469288 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" Apr 16 22:58:05.586894 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:05.586860 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"] Apr 16 22:58:05.590401 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:58:05.590369 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc534b9aa_dbfb_479a_be18_34a1b9e41179.slice/crio-a0eee7145e61caa76548cecb37d23f30fb3fa7ab8dbf44fa5bccfe723ee2309a WatchSource:0}: Error finding container a0eee7145e61caa76548cecb37d23f30fb3fa7ab8dbf44fa5bccfe723ee2309a: Status 404 returned error can't find the container with id a0eee7145e61caa76548cecb37d23f30fb3fa7ab8dbf44fa5bccfe723ee2309a Apr 16 22:58:05.592744 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:05.592724 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:58:06.111998 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:06.111950 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.54:8643/healthz\": dial tcp 10.133.0.54:8643: connect: connection refused" Apr 16 22:58:06.293801 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:06.293759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" event={"ID":"c534b9aa-dbfb-479a-be18-34a1b9e41179","Type":"ContainerStarted","Data":"cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587"} Apr 16 22:58:06.293801 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:06.293797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" event={"ID":"c534b9aa-dbfb-479a-be18-34a1b9e41179","Type":"ContainerStarted","Data":"a0eee7145e61caa76548cecb37d23f30fb3fa7ab8dbf44fa5bccfe723ee2309a"} Apr 16 22:58:07.160516 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:07.160473 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.54:8080/v2/models/sklearn-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 16 22:58:11.112293 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.112252 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.54:8643/healthz\": dial tcp 10.133.0.54:8643: connect: connection refused" Apr 16 22:58:11.313155 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.313073 2576 generic.go:358] "Generic (PLEG): container finished" podID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerID="cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587" exitCode=0 Apr 16 22:58:11.313155 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.313145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" event={"ID":"c534b9aa-dbfb-479a-be18-34a1b9e41179","Type":"ContainerDied","Data":"cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587"} Apr 16 22:58:11.839903 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.839882 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" Apr 16 22:58:11.883273 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.883249 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed14173-a9fa-4a1a-9883-7ca59c34722b-proxy-tls\") pod \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " Apr 16 22:58:11.883474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.883296 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwpfr\" (UniqueName: \"kubernetes.io/projected/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kube-api-access-wwpfr\") pod \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " Apr 16 22:58:11.883474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.883361 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ed14173-a9fa-4a1a-9883-7ca59c34722b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " Apr 16 22:58:11.883474 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.883394 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kserve-provision-location\") pod \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\" (UID: \"9ed14173-a9fa-4a1a-9883-7ca59c34722b\") " Apr 16 22:58:11.883798 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.883763 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed14173-a9fa-4a1a-9883-7ca59c34722b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "9ed14173-a9fa-4a1a-9883-7ca59c34722b" (UID: "9ed14173-a9fa-4a1a-9883-7ca59c34722b"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:58:11.883798 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.883776 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ed14173-a9fa-4a1a-9883-7ca59c34722b" (UID: "9ed14173-a9fa-4a1a-9883-7ca59c34722b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:58:11.885979 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.885954 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed14173-a9fa-4a1a-9883-7ca59c34722b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ed14173-a9fa-4a1a-9883-7ca59c34722b" (UID: "9ed14173-a9fa-4a1a-9883-7ca59c34722b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:58:11.886095 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.886056 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kube-api-access-wwpfr" (OuterVolumeSpecName: "kube-api-access-wwpfr") pod "9ed14173-a9fa-4a1a-9883-7ca59c34722b" (UID: "9ed14173-a9fa-4a1a-9883-7ca59c34722b"). InnerVolumeSpecName "kube-api-access-wwpfr". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:58:11.985031 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.984999 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed14173-a9fa-4a1a-9883-7ca59c34722b-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:58:11.985031 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.985029 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwpfr\" (UniqueName: \"kubernetes.io/projected/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kube-api-access-wwpfr\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:58:11.985228 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.985044 2576 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ed14173-a9fa-4a1a-9883-7ca59c34722b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:58:11.985228 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:11.985059 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ed14173-a9fa-4a1a-9883-7ca59c34722b-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:58:12.318380 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.318268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" event={"ID":"c534b9aa-dbfb-479a-be18-34a1b9e41179","Type":"ContainerStarted","Data":"705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3"}
Apr 16 22:58:12.318380 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.318315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" event={"ID":"c534b9aa-dbfb-479a-be18-34a1b9e41179","Type":"ContainerStarted","Data":"0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782"}
Apr 16 22:58:12.318870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.318793 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"
Apr 16 22:58:12.318870 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.318820 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"
Apr 16 22:58:12.320264 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.320234 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 16 22:58:12.320409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.320261 2576 generic.go:358] "Generic (PLEG): container finished" podID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerID="9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b" exitCode=0
Apr 16 22:58:12.320409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.320349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" event={"ID":"9ed14173-a9fa-4a1a-9883-7ca59c34722b","Type":"ContainerDied","Data":"9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b"}
Apr 16 22:58:12.320409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.320383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh" event={"ID":"9ed14173-a9fa-4a1a-9883-7ca59c34722b","Type":"ContainerDied","Data":"51f80145c556c1bcdc770c806b2a59b70f709d818afcf65c633a2417c854363c"}
Apr 16 22:58:12.320409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.320358 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh"
Apr 16 22:58:12.320409 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.320396 2576 scope.go:117] "RemoveContainer" containerID="7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a"
Apr 16 22:58:12.328782 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.328763 2576 scope.go:117] "RemoveContainer" containerID="9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b"
Apr 16 22:58:12.337550 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.337507 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" podStartSLOduration=8.337492514000001 podStartE2EDuration="8.337492514s" podCreationTimestamp="2026-04-16 22:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:58:12.335019909 +0000 UTC m=+2671.459644189" watchObservedRunningTime="2026-04-16 22:58:12.337492514 +0000 UTC m=+2671.462116802"
Apr 16 22:58:12.337659 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.337537 2576 scope.go:117] "RemoveContainer" containerID="17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8"
Apr 16 22:58:12.344447 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.344429 2576 scope.go:117] "RemoveContainer" containerID="7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a"
Apr 16 22:58:12.344706 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:58:12.344683 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a\": container with ID starting with 7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a not found: ID does not exist" containerID="7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a"
Apr 16 22:58:12.344787 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.344722 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a"} err="failed to get container status \"7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a\": rpc error: code = NotFound desc = could not find container \"7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a\": container with ID starting with 7bb42f55349ea3f286480b1a8e61c772e21670d79b92c609ee433b3892d20b5a not found: ID does not exist"
Apr 16 22:58:12.344787 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.344746 2576 scope.go:117] "RemoveContainer" containerID="9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b"
Apr 16 22:58:12.345006 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:58:12.344984 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b\": container with ID starting with 9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b not found: ID does not exist" containerID="9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b"
Apr 16 22:58:12.345104 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.345009 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b"} err="failed to get container status \"9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b\": rpc error: code = NotFound desc = could not find container \"9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b\": container with ID starting with 9a295eba57b9f6542b4df72d62bbced9e7826d610e081e7900a6c42e42b59a4b not found: ID does not exist"
Apr 16 22:58:12.345104 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.345031 2576 scope.go:117] "RemoveContainer" containerID="17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8"
Apr 16 22:58:12.345230 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:58:12.345211 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8\": container with ID starting with 17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8 not found: ID does not exist" containerID="17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8"
Apr 16 22:58:12.345269 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.345239 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8"} err="failed to get container status \"17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8\": rpc error: code = NotFound desc = could not find container \"17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8\": container with ID starting with 17c5dd365bf65540541b628a3d02b8b2fd6b563d991bb9a752394454f60ddee8 not found: ID does not exist"
Apr 16 22:58:12.348936 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.348907 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh"]
Apr 16 22:58:12.351603 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:12.351582 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-f4fdh"]
Apr 16 22:58:13.325451 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:13.325412 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 16 22:58:13.454481 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:13.454441 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" path="/var/lib/kubelet/pods/9ed14173-a9fa-4a1a-9883-7ca59c34722b/volumes"
Apr 16 22:58:18.330930 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:18.330898 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"
Apr 16 22:58:18.331516 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:18.331469 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 16 22:58:28.332197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:28.332166 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"
Apr 16 22:58:41.515839 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.515764 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-65cd49579f-dxsln_c534b9aa-dbfb-479a-be18-34a1b9e41179/kserve-container/0.log"
Apr 16 22:58:41.645849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.645819 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"]
Apr 16 22:58:41.646211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.646153 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kserve-container" containerID="cri-o://0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782" gracePeriod=30
Apr 16 22:58:41.646532 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.646475 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kube-rbac-proxy" containerID="cri-o://705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3" gracePeriod=30
Apr 16 22:58:41.718466 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718429 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"]
Apr 16 22:58:41.718743 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718731 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kserve-container"
Apr 16 22:58:41.718796 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718745 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kserve-container"
Apr 16 22:58:41.718796 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718752 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kube-rbac-proxy"
Apr 16 22:58:41.718796 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718757 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kube-rbac-proxy"
Apr 16 22:58:41.718796 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718774 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="storage-initializer"
Apr 16 22:58:41.718796 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718782 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="storage-initializer"
Apr 16 22:58:41.718968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718843 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kserve-container"
Apr 16 22:58:41.718968 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.718850 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ed14173-a9fa-4a1a-9883-7ca59c34722b" containerName="kube-rbac-proxy"
Apr 16 22:58:41.722253 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.722233 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.724443 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.724420 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\""
Apr 16 22:58:41.724572 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.724554 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 16 22:58:41.730537 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.730516 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"]
Apr 16 22:58:41.811773 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.811692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08e35d76-1d7b-4cf7-9620-7795cd364b64-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.811773 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.811745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08e35d76-1d7b-4cf7-9620-7795cd364b64-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.811955 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.811811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08e35d76-1d7b-4cf7-9620-7795cd364b64-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.811955 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.811860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zmth\" (UniqueName: \"kubernetes.io/projected/08e35d76-1d7b-4cf7-9620-7795cd364b64-kube-api-access-6zmth\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.912646 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.912610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08e35d76-1d7b-4cf7-9620-7795cd364b64-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.912824 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.912664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08e35d76-1d7b-4cf7-9620-7795cd364b64-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.912824 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.912695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zmth\" (UniqueName: \"kubernetes.io/projected/08e35d76-1d7b-4cf7-9620-7795cd364b64-kube-api-access-6zmth\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.912824 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.912729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08e35d76-1d7b-4cf7-9620-7795cd364b64-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.913067 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.913044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08e35d76-1d7b-4cf7-9620-7795cd364b64-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.913359 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.913311 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08e35d76-1d7b-4cf7-9620-7795cd364b64-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.915486 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.915462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08e35d76-1d7b-4cf7-9620-7795cd364b64-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:41.919961 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:41.919941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zmth\" (UniqueName: \"kubernetes.io/projected/08e35d76-1d7b-4cf7-9620-7795cd364b64-kube-api-access-6zmth\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:42.033211 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.033181 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:42.159204 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.159058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"]
Apr 16 22:58:42.162001 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:58:42.161979 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e35d76_1d7b_4cf7_9620_7795cd364b64.slice/crio-9f17f6bee02a07a4261451ac98567fcc330f880059ac2d85a8e7d9ef150e8888 WatchSource:0}: Error finding container 9f17f6bee02a07a4261451ac98567fcc330f880059ac2d85a8e7d9ef150e8888: Status 404 returned error can't find the container with id 9f17f6bee02a07a4261451ac98567fcc330f880059ac2d85a8e7d9ef150e8888
Apr 16 22:58:42.419017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.418918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" event={"ID":"08e35d76-1d7b-4cf7-9620-7795cd364b64","Type":"ContainerStarted","Data":"fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c"}
Apr 16 22:58:42.419017 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.418962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" event={"ID":"08e35d76-1d7b-4cf7-9620-7795cd364b64","Type":"ContainerStarted","Data":"9f17f6bee02a07a4261451ac98567fcc330f880059ac2d85a8e7d9ef150e8888"}
Apr 16 22:58:42.421103 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.421076 2576 generic.go:358] "Generic (PLEG): container finished" podID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerID="705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3" exitCode=2
Apr 16 22:58:42.421233 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.421131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" event={"ID":"c534b9aa-dbfb-479a-be18-34a1b9e41179","Type":"ContainerDied","Data":"705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3"}
Apr 16 22:58:42.602560 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.602536 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"
Apr 16 22:58:42.721132 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.721100 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls\") pod \"c534b9aa-dbfb-479a-be18-34a1b9e41179\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") "
Apr 16 22:58:42.721309 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.721142 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c534b9aa-dbfb-479a-be18-34a1b9e41179-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"c534b9aa-dbfb-479a-be18-34a1b9e41179\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") "
Apr 16 22:58:42.721309 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.721170 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c534b9aa-dbfb-479a-be18-34a1b9e41179-kserve-provision-location\") pod \"c534b9aa-dbfb-479a-be18-34a1b9e41179\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") "
Apr 16 22:58:42.721309 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.721204 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxhp6\" (UniqueName: \"kubernetes.io/projected/c534b9aa-dbfb-479a-be18-34a1b9e41179-kube-api-access-nxhp6\") pod \"c534b9aa-dbfb-479a-be18-34a1b9e41179\" (UID: \"c534b9aa-dbfb-479a-be18-34a1b9e41179\") "
Apr 16 22:58:42.721564 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.721540 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c534b9aa-dbfb-479a-be18-34a1b9e41179-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "c534b9aa-dbfb-479a-be18-34a1b9e41179" (UID: "c534b9aa-dbfb-479a-be18-34a1b9e41179"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:58:42.723298 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.723266 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c534b9aa-dbfb-479a-be18-34a1b9e41179" (UID: "c534b9aa-dbfb-479a-be18-34a1b9e41179"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:58:42.723519 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.723491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c534b9aa-dbfb-479a-be18-34a1b9e41179-kube-api-access-nxhp6" (OuterVolumeSpecName: "kube-api-access-nxhp6") pod "c534b9aa-dbfb-479a-be18-34a1b9e41179" (UID: "c534b9aa-dbfb-479a-be18-34a1b9e41179"). InnerVolumeSpecName "kube-api-access-nxhp6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:58:42.743277 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.743241 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c534b9aa-dbfb-479a-be18-34a1b9e41179-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c534b9aa-dbfb-479a-be18-34a1b9e41179" (UID: "c534b9aa-dbfb-479a-be18-34a1b9e41179"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:58:42.822156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.822120 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxhp6\" (UniqueName: \"kubernetes.io/projected/c534b9aa-dbfb-479a-be18-34a1b9e41179-kube-api-access-nxhp6\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:58:42.822156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.822150 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c534b9aa-dbfb-479a-be18-34a1b9e41179-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:58:42.822156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.822160 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c534b9aa-dbfb-479a-be18-34a1b9e41179-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:58:42.822410 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:42.822173 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c534b9aa-dbfb-479a-be18-34a1b9e41179-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 22:58:43.426197 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.426165 2576 generic.go:358] "Generic (PLEG): container finished" podID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerID="0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782" exitCode=0
Apr 16 22:58:43.426404 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.426238 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"
Apr 16 22:58:43.426404 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.426254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" event={"ID":"c534b9aa-dbfb-479a-be18-34a1b9e41179","Type":"ContainerDied","Data":"0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782"}
Apr 16 22:58:43.426404 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.426296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln" event={"ID":"c534b9aa-dbfb-479a-be18-34a1b9e41179","Type":"ContainerDied","Data":"a0eee7145e61caa76548cecb37d23f30fb3fa7ab8dbf44fa5bccfe723ee2309a"}
Apr 16 22:58:43.426404 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.426313 2576 scope.go:117] "RemoveContainer" containerID="705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3"
Apr 16 22:58:43.435180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.435161 2576 scope.go:117] "RemoveContainer" containerID="0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782"
Apr 16 22:58:43.442254 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.442237 2576 scope.go:117] "RemoveContainer" containerID="cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587"
Apr 16 22:58:43.446990 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.446966 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"]
Apr 16 22:58:43.450079 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.450055 2576 scope.go:117] "RemoveContainer" containerID="705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3"
Apr 16 22:58:43.450432 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:58:43.450405 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3\": container with ID starting with 705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3 not found: ID does not exist" containerID="705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3"
Apr 16 22:58:43.450515 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.450446 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3"} err="failed to get container status \"705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3\": rpc error: code = NotFound desc = could not find container \"705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3\": container with ID starting with 705350ee08914c2a90f8f4eac50fcc41bf0a4c26bbb363db2a4b84e225080ab3 not found: ID does not exist"
Apr 16 22:58:43.450515 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.450466 2576 scope.go:117] "RemoveContainer" containerID="0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782"
Apr 16 22:58:43.450776 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:58:43.450748 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782\": container with ID starting with 0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782 not found: ID does not exist" containerID="0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782"
Apr 16 22:58:43.450819 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.450784 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782"} err="failed to get container status \"0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782\": rpc error: code = NotFound desc = could not find container \"0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782\": container with ID starting with 0202be16c407372b19a3407ecc9c933eec2aa3a19d671ec53d42550c6b433782 not found: ID does not exist"
Apr 16 22:58:43.450819 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.450803 2576 scope.go:117] "RemoveContainer" containerID="cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587"
Apr 16 22:58:43.451072 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:58:43.451047 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587\": container with ID starting with cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587 not found: ID does not exist" containerID="cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587"
Apr 16 22:58:43.451118 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.451085 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587"} err="failed to get container status \"cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587\": rpc error: code = NotFound desc = could not find container \"cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587\": container with ID starting with cec65b28868a089950edbc88406d2ba8913f91d6e54f5ee1543cd770bc902587 not found: ID does not exist"
Apr 16 22:58:43.452994 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:43.452969 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-dxsln"]
Apr 16 22:58:45.452154 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:45.452117 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" path="/var/lib/kubelet/pods/c534b9aa-dbfb-479a-be18-34a1b9e41179/volumes"
Apr 16 22:58:46.437998 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:46.437967 2576 generic.go:358] "Generic (PLEG): container finished" podID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerID="fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c" exitCode=0
Apr 16 22:58:46.438156 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:46.438014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" event={"ID":"08e35d76-1d7b-4cf7-9620-7795cd364b64","Type":"ContainerDied","Data":"fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c"}
Apr 16 22:58:47.444254 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:47.444217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" event={"ID":"08e35d76-1d7b-4cf7-9620-7795cd364b64","Type":"ContainerStarted","Data":"c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840"}
Apr 16 22:58:47.444254 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:47.444259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" event={"ID":"08e35d76-1d7b-4cf7-9620-7795cd364b64","Type":"ContainerStarted","Data":"ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47"}
Apr 16 22:58:47.444814 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:47.444515 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:47.444814 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:47.444577 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:58:47.463337 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:47.463283 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" podStartSLOduration=6.46327146 podStartE2EDuration="6.46327146s" podCreationTimestamp="2026-04-16 22:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:58:47.461207917 +0000 UTC m=+2706.585832216" watchObservedRunningTime="2026-04-16 22:58:47.46327146 +0000 UTC m=+2706.587895747"
Apr 16 22:58:53.453038 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:58:53.453008 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:59:23.537376 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:23.537331 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 16 22:59:33.456446 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:33.456417 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"
Apr 16 22:59:41.807511 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.807477 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"]
Apr 16 22:59:41.807901 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.807875 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kserve-container" containerID="cri-o://ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47" gracePeriod=30
Apr 16 22:59:41.808020 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.807966
2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kube-rbac-proxy" containerID="cri-o://c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840" gracePeriod=30 Apr 16 22:59:41.884110 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884079 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl"] Apr 16 22:59:41.884526 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884508 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kube-rbac-proxy" Apr 16 22:59:41.884526 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884528 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kube-rbac-proxy" Apr 16 22:59:41.884690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884553 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="storage-initializer" Apr 16 22:59:41.884690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884562 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="storage-initializer" Apr 16 22:59:41.884690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884579 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kserve-container" Apr 16 22:59:41.884690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884589 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kserve-container" Apr 16 22:59:41.884690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884670 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kube-rbac-proxy" Apr 16 22:59:41.884690 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.884686 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c534b9aa-dbfb-479a-be18-34a1b9e41179" containerName="kserve-container" Apr 16 22:59:41.887819 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.887797 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:41.890180 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.890159 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 16 22:59:41.890466 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.890178 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 16 22:59:41.897023 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.897000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl"] Apr 16 22:59:41.992980 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.992947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpj4t\" (UniqueName: \"kubernetes.io/projected/c0be1175-0da7-4058-9545-a4bff8c698fa-kube-api-access-xpj4t\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:41.993148 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.993001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c0be1175-0da7-4058-9545-a4bff8c698fa-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:41.993148 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.993024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0be1175-0da7-4058-9545-a4bff8c698fa-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:41.993148 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:41.993097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0be1175-0da7-4058-9545-a4bff8c698fa-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.094046 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.093965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpj4t\" (UniqueName: \"kubernetes.io/projected/c0be1175-0da7-4058-9545-a4bff8c698fa-kube-api-access-xpj4t\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.094189 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.094044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0be1175-0da7-4058-9545-a4bff8c698fa-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") 
pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.094189 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.094073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0be1175-0da7-4058-9545-a4bff8c698fa-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.094189 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.094113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0be1175-0da7-4058-9545-a4bff8c698fa-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.094528 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.094506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0be1175-0da7-4058-9545-a4bff8c698fa-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.094849 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.094816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0be1175-0da7-4058-9545-a4bff8c698fa-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.096678 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.096663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0be1175-0da7-4058-9545-a4bff8c698fa-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.101841 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.101810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpj4t\" (UniqueName: \"kubernetes.io/projected/c0be1175-0da7-4058-9545-a4bff8c698fa-kube-api-access-xpj4t\") pod \"isvc-sklearn-v2-predictor-69755fbb9-kh7dl\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.199479 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.199438 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:42.323074 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.323048 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl"] Apr 16 22:59:42.325502 ip-10-0-133-183 kubenswrapper[2576]: W0416 22:59:42.325476 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0be1175_0da7_4058_9545_a4bff8c698fa.slice/crio-8315ad779b3a915f04ef966bb1c116a266e759c5f0358895569f70d5e082eb7f WatchSource:0}: Error finding container 8315ad779b3a915f04ef966bb1c116a266e759c5f0358895569f70d5e082eb7f: Status 404 returned error can't find the container with id 8315ad779b3a915f04ef966bb1c116a266e759c5f0358895569f70d5e082eb7f Apr 16 22:59:42.621671 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.621577 2576 generic.go:358] "Generic (PLEG): container finished" podID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerID="c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840" exitCode=2 Apr 16 22:59:42.621817 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.621657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" event={"ID":"08e35d76-1d7b-4cf7-9620-7795cd364b64","Type":"ContainerDied","Data":"c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840"} Apr 16 22:59:42.622877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.622846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" event={"ID":"c0be1175-0da7-4058-9545-a4bff8c698fa","Type":"ContainerStarted","Data":"c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2"} Apr 16 22:59:42.622877 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:42.622877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" event={"ID":"c0be1175-0da7-4058-9545-a4bff8c698fa","Type":"ContainerStarted","Data":"8315ad779b3a915f04ef966bb1c116a266e759c5f0358895569f70d5e082eb7f"} Apr 16 22:59:43.448475 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:43.448427 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.56:8643/healthz\": dial tcp 10.133.0.56:8643: connect: connection refused" Apr 16 22:59:44.494540 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:44.494486 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.56:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 16 22:59:46.637616 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:46.637579 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerID="c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2" exitCode=0 Apr 16 22:59:46.637977 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:46.637654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" event={"ID":"c0be1175-0da7-4058-9545-a4bff8c698fa","Type":"ContainerDied","Data":"c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2"} Apr 16 22:59:47.646437 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:47.646396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" 
event={"ID":"c0be1175-0da7-4058-9545-a4bff8c698fa","Type":"ContainerStarted","Data":"8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777"} Apr 16 22:59:47.646437 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:47.646440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" event={"ID":"c0be1175-0da7-4058-9545-a4bff8c698fa","Type":"ContainerStarted","Data":"b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3"} Apr 16 22:59:47.646917 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:47.646735 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:47.646917 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:47.646855 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:47.647835 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:47.647810 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 22:59:47.664090 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:47.664053 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podStartSLOduration=6.664041521 podStartE2EDuration="6.664041521s" podCreationTimestamp="2026-04-16 22:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:59:47.662532513 +0000 UTC m=+2766.787156801" watchObservedRunningTime="2026-04-16 22:59:47.664041521 +0000 UTC m=+2766.788665808" Apr 16 22:59:48.448489 ip-10-0-133-183 
kubenswrapper[2576]: I0416 22:59:48.448437 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.56:8643/healthz\": dial tcp 10.133.0.56:8643: connect: connection refused" Apr 16 22:59:48.649987 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:48.649940 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 22:59:49.250992 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.250971 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" Apr 16 22:59:49.353131 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.353040 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zmth\" (UniqueName: \"kubernetes.io/projected/08e35d76-1d7b-4cf7-9620-7795cd364b64-kube-api-access-6zmth\") pod \"08e35d76-1d7b-4cf7-9620-7795cd364b64\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " Apr 16 22:59:49.353131 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.353107 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08e35d76-1d7b-4cf7-9620-7795cd364b64-proxy-tls\") pod \"08e35d76-1d7b-4cf7-9620-7795cd364b64\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " Apr 16 22:59:49.353378 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.353147 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/08e35d76-1d7b-4cf7-9620-7795cd364b64-kserve-provision-location\") pod \"08e35d76-1d7b-4cf7-9620-7795cd364b64\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " Apr 16 22:59:49.353378 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.353167 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08e35d76-1d7b-4cf7-9620-7795cd364b64-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"08e35d76-1d7b-4cf7-9620-7795cd364b64\" (UID: \"08e35d76-1d7b-4cf7-9620-7795cd364b64\") " Apr 16 22:59:49.353500 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.353468 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08e35d76-1d7b-4cf7-9620-7795cd364b64-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "08e35d76-1d7b-4cf7-9620-7795cd364b64" (UID: "08e35d76-1d7b-4cf7-9620-7795cd364b64"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:59:49.353568 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.353541 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08e35d76-1d7b-4cf7-9620-7795cd364b64-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "08e35d76-1d7b-4cf7-9620-7795cd364b64" (UID: "08e35d76-1d7b-4cf7-9620-7795cd364b64"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:59:49.355173 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.355150 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e35d76-1d7b-4cf7-9620-7795cd364b64-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "08e35d76-1d7b-4cf7-9620-7795cd364b64" (UID: "08e35d76-1d7b-4cf7-9620-7795cd364b64"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:59:49.355173 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.355162 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e35d76-1d7b-4cf7-9620-7795cd364b64-kube-api-access-6zmth" (OuterVolumeSpecName: "kube-api-access-6zmth") pod "08e35d76-1d7b-4cf7-9620-7795cd364b64" (UID: "08e35d76-1d7b-4cf7-9620-7795cd364b64"). InnerVolumeSpecName "kube-api-access-6zmth". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:59:49.453899 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.453872 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08e35d76-1d7b-4cf7-9620-7795cd364b64-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:59:49.453899 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.453895 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/08e35d76-1d7b-4cf7-9620-7795cd364b64-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:59:49.454082 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.453906 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zmth\" (UniqueName: \"kubernetes.io/projected/08e35d76-1d7b-4cf7-9620-7795cd364b64-kube-api-access-6zmth\") on node 
\"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:59:49.454082 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.453915 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08e35d76-1d7b-4cf7-9620-7795cd364b64-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 22:59:49.654478 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.654392 2576 generic.go:358] "Generic (PLEG): container finished" podID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerID="ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47" exitCode=0 Apr 16 22:59:49.654861 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.654483 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" Apr 16 22:59:49.654861 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.654473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" event={"ID":"08e35d76-1d7b-4cf7-9620-7795cd364b64","Type":"ContainerDied","Data":"ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47"} Apr 16 22:59:49.654861 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.654589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472" event={"ID":"08e35d76-1d7b-4cf7-9620-7795cd364b64","Type":"ContainerDied","Data":"9f17f6bee02a07a4261451ac98567fcc330f880059ac2d85a8e7d9ef150e8888"} Apr 16 22:59:49.654861 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.654611 2576 scope.go:117] "RemoveContainer" containerID="c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840" Apr 16 22:59:49.662411 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.662394 2576 scope.go:117] "RemoveContainer" containerID="ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47" Apr 16 
22:59:49.668920 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.668855 2576 scope.go:117] "RemoveContainer" containerID="fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c" Apr 16 22:59:49.670873 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.670853 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"] Apr 16 22:59:49.675960 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.675927 2576 scope.go:117] "RemoveContainer" containerID="c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840" Apr 16 22:59:49.676210 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:59:49.676190 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840\": container with ID starting with c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840 not found: ID does not exist" containerID="c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840" Apr 16 22:59:49.676299 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.676223 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840"} err="failed to get container status \"c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840\": rpc error: code = NotFound desc = could not find container \"c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840\": container with ID starting with c7b48170b4f79d0626c12e4b3383f1181d37ce7c0dd3a4b4ec2ba860ca7d0840 not found: ID does not exist" Apr 16 22:59:49.676299 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.676248 2576 scope.go:117] "RemoveContainer" containerID="ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47" Apr 16 22:59:49.676420 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.676305 2576 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-2r472"] Apr 16 22:59:49.676607 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:59:49.676588 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47\": container with ID starting with ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47 not found: ID does not exist" containerID="ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47" Apr 16 22:59:49.676651 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.676614 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47"} err="failed to get container status \"ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47\": rpc error: code = NotFound desc = could not find container \"ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47\": container with ID starting with ab38fb581e84ba4f857d7ccaf1a46c34d6619f0d8dbf9c63659f84c692999f47 not found: ID does not exist" Apr 16 22:59:49.676651 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.676629 2576 scope.go:117] "RemoveContainer" containerID="fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c" Apr 16 22:59:49.676850 ip-10-0-133-183 kubenswrapper[2576]: E0416 22:59:49.676827 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c\": container with ID starting with fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c not found: ID does not exist" containerID="fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c" Apr 16 22:59:49.676906 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:49.676859 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c"} err="failed to get container status \"fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c\": rpc error: code = NotFound desc = could not find container \"fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c\": container with ID starting with fe11deda917924f2eea3bb44e5ddc33a2a5d3da0a5a7a022c2b906c3d5170f4c not found: ID does not exist" Apr 16 22:59:51.452770 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:51.452736 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" path="/var/lib/kubelet/pods/08e35d76-1d7b-4cf7-9620-7795cd364b64/volumes" Apr 16 22:59:53.654261 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:53.654234 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 22:59:53.654938 ip-10-0-133-183 kubenswrapper[2576]: I0416 22:59:53.654905 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 23:00:03.655112 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:00:03.655018 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 23:00:13.655556 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:00:13.655517 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" 
podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 23:00:23.654821 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:00:23.654783 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 23:00:33.655578 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:00:33.655537 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 23:00:43.655340 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:00:43.655292 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 23:00:53.656318 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:00:53.656287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 23:01:02.055625 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.055590 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl"] Apr 16 23:01:02.056104 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.055981 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" 
containerName="kserve-container" containerID="cri-o://b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3" gracePeriod=30 Apr 16 23:01:02.056104 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.056003 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kube-rbac-proxy" containerID="cri-o://8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777" gracePeriod=30 Apr 16 23:01:02.132749 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.132711 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz"] Apr 16 23:01:02.133066 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.133053 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kube-rbac-proxy" Apr 16 23:01:02.133066 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.133067 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kube-rbac-proxy" Apr 16 23:01:02.133161 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.133086 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="storage-initializer" Apr 16 23:01:02.133161 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.133092 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="storage-initializer" Apr 16 23:01:02.133161 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.133101 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kserve-container" Apr 16 23:01:02.133161 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.133106 2576 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kserve-container" Apr 16 23:01:02.133161 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.133151 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kserve-container" Apr 16 23:01:02.133161 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.133159 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="08e35d76-1d7b-4cf7-9620-7795cd364b64" containerName="kube-rbac-proxy" Apr 16 23:01:02.136319 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.136299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.138938 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.138918 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 16 23:01:02.139106 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.139090 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 16 23:01:02.145271 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.145247 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz"] Apr 16 23:01:02.219257 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.219222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.219444 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.219298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc2s2\" (UniqueName: \"kubernetes.io/projected/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kube-api-access-tc2s2\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.219444 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.219341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.219444 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.219363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.320354 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.320253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.320354 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.320347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc2s2\" (UniqueName: \"kubernetes.io/projected/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kube-api-access-tc2s2\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.320581 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.320375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.320581 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.320396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.320870 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.320844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.320994 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 23:01:02.320939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.322796 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.322772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.329496 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.329474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc2s2\" (UniqueName: \"kubernetes.io/projected/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kube-api-access-tc2s2\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.447735 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.447679 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:02.585251 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.585178 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz"] Apr 16 23:01:02.588698 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:01:02.588668 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a82ae5_08d1_4b05_9bce_d838f6a2bc65.slice/crio-ee8223ca47ac6e23227c9e213b37af95cab8e36d1426cd879f92bca49d1bae42 WatchSource:0}: Error finding container ee8223ca47ac6e23227c9e213b37af95cab8e36d1426cd879f92bca49d1bae42: Status 404 returned error can't find the container with id ee8223ca47ac6e23227c9e213b37af95cab8e36d1426cd879f92bca49d1bae42 Apr 16 23:01:02.888017 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.887935 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerID="8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777" exitCode=2 Apr 16 23:01:02.888168 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.888010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" event={"ID":"c0be1175-0da7-4058-9545-a4bff8c698fa","Type":"ContainerDied","Data":"8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777"} Apr 16 23:01:02.889284 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.889261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" event={"ID":"83a82ae5-08d1-4b05-9bce-d838f6a2bc65","Type":"ContainerStarted","Data":"e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b"} Apr 16 23:01:02.889425 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:02.889290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" event={"ID":"83a82ae5-08d1-4b05-9bce-d838f6a2bc65","Type":"ContainerStarted","Data":"ee8223ca47ac6e23227c9e213b37af95cab8e36d1426cd879f92bca49d1bae42"} Apr 16 23:01:03.650589 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:03.650542 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.57:8643/healthz\": dial tcp 10.133.0.57:8643: connect: connection refused" Apr 16 23:01:03.654893 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:03.654866 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 23:01:06.107167 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.107143 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 23:01:06.155573 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.155541 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0be1175-0da7-4058-9545-a4bff8c698fa-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"c0be1175-0da7-4058-9545-a4bff8c698fa\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " Apr 16 23:01:06.155756 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.155604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0be1175-0da7-4058-9545-a4bff8c698fa-kserve-provision-location\") pod \"c0be1175-0da7-4058-9545-a4bff8c698fa\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " Apr 16 23:01:06.155756 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.155624 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0be1175-0da7-4058-9545-a4bff8c698fa-proxy-tls\") pod \"c0be1175-0da7-4058-9545-a4bff8c698fa\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " Apr 16 23:01:06.155756 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.155643 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpj4t\" (UniqueName: \"kubernetes.io/projected/c0be1175-0da7-4058-9545-a4bff8c698fa-kube-api-access-xpj4t\") pod \"c0be1175-0da7-4058-9545-a4bff8c698fa\" (UID: \"c0be1175-0da7-4058-9545-a4bff8c698fa\") " Apr 16 23:01:06.155986 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.155959 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0be1175-0da7-4058-9545-a4bff8c698fa-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod 
"c0be1175-0da7-4058-9545-a4bff8c698fa" (UID: "c0be1175-0da7-4058-9545-a4bff8c698fa"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:01:06.156043 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.155970 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0be1175-0da7-4058-9545-a4bff8c698fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c0be1175-0da7-4058-9545-a4bff8c698fa" (UID: "c0be1175-0da7-4058-9545-a4bff8c698fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:01:06.157708 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.157683 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0be1175-0da7-4058-9545-a4bff8c698fa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c0be1175-0da7-4058-9545-a4bff8c698fa" (UID: "c0be1175-0da7-4058-9545-a4bff8c698fa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:01:06.157820 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.157774 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0be1175-0da7-4058-9545-a4bff8c698fa-kube-api-access-xpj4t" (OuterVolumeSpecName: "kube-api-access-xpj4t") pod "c0be1175-0da7-4058-9545-a4bff8c698fa" (UID: "c0be1175-0da7-4058-9545-a4bff8c698fa"). InnerVolumeSpecName "kube-api-access-xpj4t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:01:06.256587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.256500 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0be1175-0da7-4058-9545-a4bff8c698fa-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:01:06.256587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.256533 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0be1175-0da7-4058-9545-a4bff8c698fa-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:01:06.256587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.256544 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0be1175-0da7-4058-9545-a4bff8c698fa-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:01:06.256587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.256555 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpj4t\" (UniqueName: \"kubernetes.io/projected/c0be1175-0da7-4058-9545-a4bff8c698fa-kube-api-access-xpj4t\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:01:06.904383 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.904269 2576 generic.go:358] "Generic (PLEG): container finished" podID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerID="e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b" exitCode=0 Apr 16 23:01:06.904383 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.904364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" 
event={"ID":"83a82ae5-08d1-4b05-9bce-d838f6a2bc65","Type":"ContainerDied","Data":"e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b"} Apr 16 23:01:06.906006 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.905984 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerID="b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3" exitCode=0 Apr 16 23:01:06.906124 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.906038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" event={"ID":"c0be1175-0da7-4058-9545-a4bff8c698fa","Type":"ContainerDied","Data":"b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3"} Apr 16 23:01:06.906124 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.906053 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" Apr 16 23:01:06.906124 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.906072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl" event={"ID":"c0be1175-0da7-4058-9545-a4bff8c698fa","Type":"ContainerDied","Data":"8315ad779b3a915f04ef966bb1c116a266e759c5f0358895569f70d5e082eb7f"} Apr 16 23:01:06.906124 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.906094 2576 scope.go:117] "RemoveContainer" containerID="8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777" Apr 16 23:01:06.917571 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.917553 2576 scope.go:117] "RemoveContainer" containerID="b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3" Apr 16 23:01:06.925470 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.925451 2576 scope.go:117] "RemoveContainer" containerID="c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2" Apr 16 23:01:06.933474 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:01:06.933441 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl"] Apr 16 23:01:06.933581 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.933549 2576 scope.go:117] "RemoveContainer" containerID="8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777" Apr 16 23:01:06.933911 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:01:06.933871 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777\": container with ID starting with 8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777 not found: ID does not exist" containerID="8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777" Apr 16 23:01:06.934009 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.933910 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777"} err="failed to get container status \"8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777\": rpc error: code = NotFound desc = could not find container \"8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777\": container with ID starting with 8adc559315137a0b8a986c94de47b695350a6043b937eb27b1edd6fb33fde777 not found: ID does not exist" Apr 16 23:01:06.934009 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.933936 2576 scope.go:117] "RemoveContainer" containerID="b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3" Apr 16 23:01:06.934240 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:01:06.934212 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3\": container with ID starting with 
b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3 not found: ID does not exist" containerID="b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3" Apr 16 23:01:06.934352 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.934244 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3"} err="failed to get container status \"b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3\": rpc error: code = NotFound desc = could not find container \"b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3\": container with ID starting with b9cd8fdabbd8a37d7f02e055ed97a40c5c08b9899772a866cf12baac52712fa3 not found: ID does not exist" Apr 16 23:01:06.934352 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.934265 2576 scope.go:117] "RemoveContainer" containerID="c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2" Apr 16 23:01:06.934695 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:01:06.934673 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2\": container with ID starting with c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2 not found: ID does not exist" containerID="c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2" Apr 16 23:01:06.934768 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.934702 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2"} err="failed to get container status \"c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2\": rpc error: code = NotFound desc = could not find container \"c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2\": container with ID starting with 
c8db76acef885e5561b10a010a6596707f97f2f2528d2b768d507450f9675de2 not found: ID does not exist" Apr 16 23:01:06.935262 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:06.935246 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-kh7dl"] Apr 16 23:01:07.453394 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:07.453360 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" path="/var/lib/kubelet/pods/c0be1175-0da7-4058-9545-a4bff8c698fa/volumes" Apr 16 23:01:07.911533 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:07.911452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" event={"ID":"83a82ae5-08d1-4b05-9bce-d838f6a2bc65","Type":"ContainerStarted","Data":"61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a"} Apr 16 23:01:07.911533 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:07.911487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" event={"ID":"83a82ae5-08d1-4b05-9bce-d838f6a2bc65","Type":"ContainerStarted","Data":"89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8"} Apr 16 23:01:07.911719 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:07.911684 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:07.932632 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:07.932588 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podStartSLOduration=5.932574971 podStartE2EDuration="5.932574971s" podCreationTimestamp="2026-04-16 23:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 23:01:07.930833684 +0000 UTC m=+2847.055457971" watchObservedRunningTime="2026-04-16 23:01:07.932574971 +0000 UTC m=+2847.057199258" Apr 16 23:01:08.914602 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:08.914572 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:08.915894 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:08.915866 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:01:09.918203 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:09.918149 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:01:14.922986 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:14.922953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:01:14.923448 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:14.923424 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:01:24.923809 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:24.923762 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" 
podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:01:34.923451 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:34.923363 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:01:44.923461 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:44.923413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:01:54.923406 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:01:54.923368 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:02:04.923512 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:04.923470 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:02:14.924014 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:14.923986 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:02:22.243545 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.243511 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz"] Apr 16 23:02:22.244892 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.244832 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" containerID="cri-o://89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8" gracePeriod=30 Apr 16 23:02:22.245025 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.244901 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kube-rbac-proxy" containerID="cri-o://61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a" gracePeriod=30 Apr 16 23:02:22.313466 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313427 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8"] Apr 16 23:02:22.313793 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313780 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kube-rbac-proxy" Apr 16 23:02:22.313847 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313795 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kube-rbac-proxy" Apr 16 23:02:22.313847 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313812 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" Apr 16 23:02:22.313847 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313817 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" Apr 16 23:02:22.313847 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313824 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="storage-initializer" Apr 16 23:02:22.313847 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313831 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="storage-initializer" Apr 16 23:02:22.314019 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313878 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kube-rbac-proxy" Apr 16 23:02:22.314019 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.313888 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0be1175-0da7-4058-9545-a4bff8c698fa" containerName="kserve-container" Apr 16 23:02:22.317234 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.317212 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.319572 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.319552 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 16 23:02:22.319692 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.319558 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 16 23:02:22.327044 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.327020 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8"] Apr 16 23:02:22.364936 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.364913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpm2n\" (UniqueName: \"kubernetes.io/projected/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kube-api-access-qpm2n\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.365037 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.364949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.365037 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.364988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.365037 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.365024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.466194 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.466158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.466409 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.466209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.466409 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.466258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpm2n\" (UniqueName: 
\"kubernetes.io/projected/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kube-api-access-qpm2n\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.466409 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.466281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.466632 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.466613 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.466867 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.466836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.468847 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.468826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.476394 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.476372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpm2n\" (UniqueName: \"kubernetes.io/projected/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kube-api-access-qpm2n\") pod \"isvc-tensorflow-predictor-6756f669d7-gwwz8\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.628622 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.628520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:22.754009 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:22.753981 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8"] Apr 16 23:02:22.756727 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:02:22.756694 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ca7ef85_5e5b_48e0_bdff_b109f84e9d8e.slice/crio-8fc0821ce7b1ba5f37159f387701fb2631206aaf02b83850899621e4a684a271 WatchSource:0}: Error finding container 8fc0821ce7b1ba5f37159f387701fb2631206aaf02b83850899621e4a684a271: Status 404 returned error can't find the container with id 8fc0821ce7b1ba5f37159f387701fb2631206aaf02b83850899621e4a684a271 Apr 16 23:02:23.146869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:23.146827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" event={"ID":"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e","Type":"ContainerStarted","Data":"3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655"} Apr 16 23:02:23.146869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:23.146873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" event={"ID":"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e","Type":"ContainerStarted","Data":"8fc0821ce7b1ba5f37159f387701fb2631206aaf02b83850899621e4a684a271"} Apr 16 23:02:23.148762 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:23.148737 2576 generic.go:358] "Generic (PLEG): container finished" podID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerID="61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a" exitCode=2 Apr 16 23:02:23.148893 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:23.148798 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" event={"ID":"83a82ae5-08d1-4b05-9bce-d838f6a2bc65","Type":"ContainerDied","Data":"61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a"} Apr 16 23:02:24.918945 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:24.918901 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.58:8643/healthz\": dial tcp 10.133.0.58:8643: connect: connection refused" Apr 16 23:02:24.923716 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:24.923680 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 23:02:26.686712 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.686687 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:02:26.802123 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.802033 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-proxy-tls\") pod \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " Apr 16 23:02:26.802123 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.802097 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kserve-provision-location\") pod \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " Apr 16 23:02:26.802399 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.802156 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc2s2\" (UniqueName: \"kubernetes.io/projected/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kube-api-access-tc2s2\") pod \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " Apr 16 23:02:26.802399 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.802187 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\" (UID: \"83a82ae5-08d1-4b05-9bce-d838f6a2bc65\") " Apr 16 23:02:26.802524 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.802481 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"83a82ae5-08d1-4b05-9bce-d838f6a2bc65" (UID: "83a82ae5-08d1-4b05-9bce-d838f6a2bc65"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:02:26.802647 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.802517 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "83a82ae5-08d1-4b05-9bce-d838f6a2bc65" (UID: "83a82ae5-08d1-4b05-9bce-d838f6a2bc65"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:02:26.804249 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.804228 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "83a82ae5-08d1-4b05-9bce-d838f6a2bc65" (UID: "83a82ae5-08d1-4b05-9bce-d838f6a2bc65"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:02:26.804362 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.804240 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kube-api-access-tc2s2" (OuterVolumeSpecName: "kube-api-access-tc2s2") pod "83a82ae5-08d1-4b05-9bce-d838f6a2bc65" (UID: "83a82ae5-08d1-4b05-9bce-d838f6a2bc65"). InnerVolumeSpecName "kube-api-access-tc2s2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:02:26.902938 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.902902 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tc2s2\" (UniqueName: \"kubernetes.io/projected/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kube-api-access-tc2s2\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:02:26.902938 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.902933 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:02:26.902938 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.902949 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:02:26.903167 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:26.902957 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a82ae5-08d1-4b05-9bce-d838f6a2bc65-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:02:27.162745 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.162664 2576 generic.go:358] "Generic (PLEG): container finished" podID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerID="89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8" exitCode=0 Apr 16 23:02:27.162745 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.162704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" 
event={"ID":"83a82ae5-08d1-4b05-9bce-d838f6a2bc65","Type":"ContainerDied","Data":"89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8"} Apr 16 23:02:27.162745 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.162732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" event={"ID":"83a82ae5-08d1-4b05-9bce-d838f6a2bc65","Type":"ContainerDied","Data":"ee8223ca47ac6e23227c9e213b37af95cab8e36d1426cd879f92bca49d1bae42"} Apr 16 23:02:27.162745 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.162747 2576 scope.go:117] "RemoveContainer" containerID="61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a" Apr 16 23:02:27.163021 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.162745 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz" Apr 16 23:02:27.171445 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.171427 2576 scope.go:117] "RemoveContainer" containerID="89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8" Apr 16 23:02:27.178933 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.178912 2576 scope.go:117] "RemoveContainer" containerID="e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b" Apr 16 23:02:27.185053 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.185025 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz"] Apr 16 23:02:27.187087 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.187069 2576 scope.go:117] "RemoveContainer" containerID="61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a" Apr 16 23:02:27.187504 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:02:27.187482 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a\": container with ID starting with 61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a not found: ID does not exist" containerID="61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a" Apr 16 23:02:27.187595 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.187512 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a"} err="failed to get container status \"61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a\": rpc error: code = NotFound desc = could not find container \"61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a\": container with ID starting with 61a11401a808a367c9f1238af257aa520568cab8283da972d65d4590577ab48a not found: ID does not exist" Apr 16 23:02:27.187595 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.187531 2576 scope.go:117] "RemoveContainer" containerID="89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8" Apr 16 23:02:27.187839 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:02:27.187776 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8\": container with ID starting with 89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8 not found: ID does not exist" containerID="89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8" Apr 16 23:02:27.188408 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.187875 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8"} err="failed to get container status \"89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8\": rpc error: code = NotFound desc = could not find container 
\"89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8\": container with ID starting with 89ccf6af78e45cbd790aacbd69939c0d7508a90a642b274536301c4934fde9b8 not found: ID does not exist" Apr 16 23:02:27.188408 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.188428 2576 scope.go:117] "RemoveContainer" containerID="e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b" Apr 16 23:02:27.188724 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:02:27.188696 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b\": container with ID starting with e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b not found: ID does not exist" containerID="e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b" Apr 16 23:02:27.188827 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.188733 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b"} err="failed to get container status \"e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b\": rpc error: code = NotFound desc = could not find container \"e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b\": container with ID starting with e591efa33f18df3d92bfe132827e35bea8f814cf7f594c0f777ef862d7e9128b not found: ID does not exist" Apr 16 23:02:27.189637 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.189617 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-82fxz"] Apr 16 23:02:27.457852 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:27.457819 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" path="/var/lib/kubelet/pods/83a82ae5-08d1-4b05-9bce-d838f6a2bc65/volumes" Apr 16 23:02:28.166719 
ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:28.166686 2576 generic.go:358] "Generic (PLEG): container finished" podID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerID="3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655" exitCode=0 Apr 16 23:02:28.167205 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:28.166769 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" event={"ID":"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e","Type":"ContainerDied","Data":"3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655"} Apr 16 23:02:32.185541 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:32.185457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" event={"ID":"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e","Type":"ContainerStarted","Data":"6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64"} Apr 16 23:02:32.185541 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:32.185497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" event={"ID":"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e","Type":"ContainerStarted","Data":"a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0"} Apr 16 23:02:32.185931 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:32.185773 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:32.185931 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:32.185912 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:32.187204 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:32.187176 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" 
podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 16 23:02:32.202878 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:32.202833 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podStartSLOduration=6.655889863 podStartE2EDuration="10.202819165s" podCreationTimestamp="2026-04-16 23:02:22 +0000 UTC" firstStartedPulling="2026-04-16 23:02:28.168105104 +0000 UTC m=+2927.292729370" lastFinishedPulling="2026-04-16 23:02:31.715034405 +0000 UTC m=+2930.839658672" observedRunningTime="2026-04-16 23:02:32.201771544 +0000 UTC m=+2931.326395832" watchObservedRunningTime="2026-04-16 23:02:32.202819165 +0000 UTC m=+2931.327443451" Apr 16 23:02:33.188576 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:33.188530 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 16 23:02:38.193609 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:38.193582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:02:38.194092 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:38.194021 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 16 23:02:48.195001 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:02:48.194974 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:03:03.274401 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274310 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd"] Apr 16 23:03:03.274738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274643 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="storage-initializer" Apr 16 23:03:03.274738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274654 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="storage-initializer" Apr 16 23:03:03.274738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274662 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" Apr 16 23:03:03.274738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274668 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" Apr 16 23:03:03.274738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274687 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kube-rbac-proxy" Apr 16 23:03:03.274738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274693 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kube-rbac-proxy" Apr 16 23:03:03.274738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274734 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kserve-container" Apr 16 23:03:03.274966 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.274745 2576 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="83a82ae5-08d1-4b05-9bce-d838f6a2bc65" containerName="kube-rbac-proxy" Apr 16 23:03:03.277943 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.277920 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.280648 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.280621 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 16 23:03:03.281786 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.281760 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 16 23:03:03.285027 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.285001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kdml\" (UniqueName: \"kubernetes.io/projected/706b1917-bc2d-4a18-97bc-29c00eab72a6-kube-api-access-2kdml\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.285143 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.285035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706b1917-bc2d-4a18-97bc-29c00eab72a6-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.285143 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.285056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/706b1917-bc2d-4a18-97bc-29c00eab72a6-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.285268 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.285187 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706b1917-bc2d-4a18-97bc-29c00eab72a6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.289912 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.289888 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd"] Apr 16 23:03:03.317127 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.317090 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8"] Apr 16 23:03:03.317495 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.317443 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kserve-container" containerID="cri-o://a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0" gracePeriod=30 Apr 16 23:03:03.317641 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.317484 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" 
containerID="cri-o://6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64" gracePeriod=30 Apr 16 23:03:03.386095 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.386062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706b1917-bc2d-4a18-97bc-29c00eab72a6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.386229 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.386104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kdml\" (UniqueName: \"kubernetes.io/projected/706b1917-bc2d-4a18-97bc-29c00eab72a6-kube-api-access-2kdml\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.386229 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.386127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706b1917-bc2d-4a18-97bc-29c00eab72a6-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.386229 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.386160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706b1917-bc2d-4a18-97bc-29c00eab72a6-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.386588 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.386568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706b1917-bc2d-4a18-97bc-29c00eab72a6-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.386792 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.386768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706b1917-bc2d-4a18-97bc-29c00eab72a6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.388678 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.388656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706b1917-bc2d-4a18-97bc-29c00eab72a6-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.395187 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.395168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kdml\" (UniqueName: \"kubernetes.io/projected/706b1917-bc2d-4a18-97bc-29c00eab72a6-kube-api-access-2kdml\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-6kldd\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.590624 
ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.590553 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:03.711257 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:03.711170 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd"] Apr 16 23:03:03.713570 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:03:03.713534 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706b1917_bc2d_4a18_97bc_29c00eab72a6.slice/crio-14fbcb45498de76f32773fd116dc2a1738ab2f4cc5ea8e66cedb59d173c0c9aa WatchSource:0}: Error finding container 14fbcb45498de76f32773fd116dc2a1738ab2f4cc5ea8e66cedb59d173c0c9aa: Status 404 returned error can't find the container with id 14fbcb45498de76f32773fd116dc2a1738ab2f4cc5ea8e66cedb59d173c0c9aa Apr 16 23:03:04.289069 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:04.289031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" event={"ID":"706b1917-bc2d-4a18-97bc-29c00eab72a6","Type":"ContainerStarted","Data":"45e41a6800af7fdc09888174b7009091969ebb7d6bb954b3e9e4ee831426c000"} Apr 16 23:03:04.289069 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:04.289071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" event={"ID":"706b1917-bc2d-4a18-97bc-29c00eab72a6","Type":"ContainerStarted","Data":"14fbcb45498de76f32773fd116dc2a1738ab2f4cc5ea8e66cedb59d173c0c9aa"} Apr 16 23:03:04.291047 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:04.291022 2576 generic.go:358] "Generic (PLEG): container finished" podID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerID="6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64" exitCode=2 Apr 16 
23:03:04.291156 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:04.291048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" event={"ID":"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e","Type":"ContainerDied","Data":"6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64"} Apr 16 23:03:08.188959 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:08.188913 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 16 23:03:09.307497 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:09.307462 2576 generic.go:358] "Generic (PLEG): container finished" podID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerID="45e41a6800af7fdc09888174b7009091969ebb7d6bb954b3e9e4ee831426c000" exitCode=0 Apr 16 23:03:09.307939 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:09.307539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" event={"ID":"706b1917-bc2d-4a18-97bc-29c00eab72a6","Type":"ContainerDied","Data":"45e41a6800af7fdc09888174b7009091969ebb7d6bb954b3e9e4ee831426c000"} Apr 16 23:03:09.308845 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:09.308827 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:03:10.312687 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:10.312653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" event={"ID":"706b1917-bc2d-4a18-97bc-29c00eab72a6","Type":"ContainerStarted","Data":"660d1de68f456afea078378259f84ef419d5a5aa33366d449a4af8db91b49177"} Apr 16 23:03:10.313099 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:03:10.312692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" event={"ID":"706b1917-bc2d-4a18-97bc-29c00eab72a6","Type":"ContainerStarted","Data":"cbff6af6414848f6523b924b43944df4e39d5a87adbd9f66202ed6e4e8c5fe02"} Apr 16 23:03:10.313099 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:10.312881 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:10.331943 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:10.331898 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podStartSLOduration=7.331885902 podStartE2EDuration="7.331885902s" podCreationTimestamp="2026-04-16 23:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:03:10.330293385 +0000 UTC m=+2969.454917672" watchObservedRunningTime="2026-04-16 23:03:10.331885902 +0000 UTC m=+2969.456510189" Apr 16 23:03:11.315760 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:11.315727 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:11.317003 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:11.316976 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 16 23:03:12.319834 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:12.319784 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 16 23:03:13.188704 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:13.188661 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 16 23:03:17.325251 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:17.325223 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:17.325733 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:17.325708 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 16 23:03:18.188905 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:18.188860 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 16 23:03:18.189105 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:18.189014 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:03:23.189589 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:23.189540 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 16 23:03:27.326233 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:27.326200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:03:28.189522 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:28.189475 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 16 23:03:33.188727 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:33.188684 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 16 23:03:33.951436 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:33.951415 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:03:34.032480 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.032450 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-proxy-tls\") pod \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " Apr 16 23:03:34.032660 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.032492 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " Apr 16 23:03:34.032660 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.032522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpm2n\" (UniqueName: \"kubernetes.io/projected/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kube-api-access-qpm2n\") pod \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " Apr 16 23:03:34.032660 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.032567 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kserve-provision-location\") pod \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\" (UID: \"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e\") " Apr 16 23:03:34.032902 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.032878 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") 
pod "7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" (UID: "7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:03:34.034502 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.034475 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" (UID: "7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:03:34.034692 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.034671 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kube-api-access-qpm2n" (OuterVolumeSpecName: "kube-api-access-qpm2n") pod "7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" (UID: "7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e"). InnerVolumeSpecName "kube-api-access-qpm2n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:03:34.043783 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.043755 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" (UID: "7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:03:34.134029 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.133941 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qpm2n\" (UniqueName: \"kubernetes.io/projected/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kube-api-access-qpm2n\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:03:34.134029 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.133970 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:03:34.134029 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.133982 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:03:34.134029 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.133992 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:03:34.388115 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.388026 2576 generic.go:358] "Generic (PLEG): container finished" podID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerID="a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0" exitCode=137 Apr 16 23:03:34.388115 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.388094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" 
event={"ID":"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e","Type":"ContainerDied","Data":"a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0"} Apr 16 23:03:34.388627 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.388132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" event={"ID":"7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e","Type":"ContainerDied","Data":"8fc0821ce7b1ba5f37159f387701fb2631206aaf02b83850899621e4a684a271"} Apr 16 23:03:34.388627 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.388151 2576 scope.go:117] "RemoveContainer" containerID="6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64" Apr 16 23:03:34.388627 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.388099 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8" Apr 16 23:03:34.396190 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.396172 2576 scope.go:117] "RemoveContainer" containerID="a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0" Apr 16 23:03:34.403067 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.403048 2576 scope.go:117] "RemoveContainer" containerID="3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655" Apr 16 23:03:34.410000 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.409939 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8"] Apr 16 23:03:34.410062 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.410026 2576 scope.go:117] "RemoveContainer" containerID="6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64" Apr 16 23:03:34.410282 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:03:34.410257 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64\": container with ID starting with 6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64 not found: ID does not exist" containerID="6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64" Apr 16 23:03:34.410368 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.410294 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64"} err="failed to get container status \"6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64\": rpc error: code = NotFound desc = could not find container \"6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64\": container with ID starting with 6585763f3cb4d80e31946beb663d49f2ebec63f6cd425043464bbfb38e71fa64 not found: ID does not exist" Apr 16 23:03:34.410368 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.410317 2576 scope.go:117] "RemoveContainer" containerID="a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0" Apr 16 23:03:34.410654 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:03:34.410626 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0\": container with ID starting with a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0 not found: ID does not exist" containerID="a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0" Apr 16 23:03:34.410754 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.410661 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0"} err="failed to get container status \"a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0\": rpc error: code = NotFound desc = could not find container 
\"a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0\": container with ID starting with a5bf266029777875661b648dc5cf1e6db930a3a98bfd2e187683630fabcc03f0 not found: ID does not exist" Apr 16 23:03:34.410754 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.410682 2576 scope.go:117] "RemoveContainer" containerID="3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655" Apr 16 23:03:34.410927 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:03:34.410912 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655\": container with ID starting with 3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655 not found: ID does not exist" containerID="3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655" Apr 16 23:03:34.410964 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.410930 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655"} err="failed to get container status \"3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655\": rpc error: code = NotFound desc = could not find container \"3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655\": container with ID starting with 3b1c8c4a26e96be7ef7dc8d333e0e88f6c978eafc80b1012dd77c8181f60a655 not found: ID does not exist" Apr 16 23:03:34.413505 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:34.413483 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-gwwz8"] Apr 16 23:03:35.453113 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:35.453078 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" path="/var/lib/kubelet/pods/7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e/volumes" Apr 16 23:03:44.157040 
ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.157007 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd"] Apr 16 23:03:44.157554 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.157438 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kserve-container" containerID="cri-o://cbff6af6414848f6523b924b43944df4e39d5a87adbd9f66202ed6e4e8c5fe02" gracePeriod=30 Apr 16 23:03:44.157639 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.157533 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" containerID="cri-o://660d1de68f456afea078378259f84ef419d5a5aa33366d449a4af8db91b49177" gracePeriod=30 Apr 16 23:03:44.242220 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242186 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq"] Apr 16 23:03:44.242531 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242518 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" Apr 16 23:03:44.242587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242532 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" Apr 16 23:03:44.242587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242542 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="storage-initializer" Apr 16 23:03:44.242587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242548 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="storage-initializer" Apr 16 23:03:44.242587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242558 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kserve-container" Apr 16 23:03:44.242587 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242563 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kserve-container" Apr 16 23:03:44.242751 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242614 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kserve-container" Apr 16 23:03:44.242751 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.242624 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ca7ef85-5e5b-48e0-bdff-b109f84e9d8e" containerName="kube-rbac-proxy" Apr 16 23:03:44.247215 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.247187 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.249700 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.249676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 16 23:03:44.249700 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.249686 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 16 23:03:44.254756 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.254734 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq"] Apr 16 23:03:44.317448 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.317411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.317448 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.317449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nckl\" (UniqueName: \"kubernetes.io/projected/3d797686-0011-497c-84cc-fbd493049bd0-kube-api-access-2nckl\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.317648 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.317543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d797686-0011-497c-84cc-fbd493049bd0-isvc-triton-kube-rbac-proxy-sar-config\") 
pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.317648 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.317575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d797686-0011-497c-84cc-fbd493049bd0-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.418628 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.418544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d797686-0011-497c-84cc-fbd493049bd0-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.418628 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.418580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d797686-0011-497c-84cc-fbd493049bd0-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.418628 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.418630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.418891 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.418648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nckl\" (UniqueName: \"kubernetes.io/projected/3d797686-0011-497c-84cc-fbd493049bd0-kube-api-access-2nckl\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.418891 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:03:44.418779 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-triton-predictor-serving-cert: secret "isvc-triton-predictor-serving-cert" not found Apr 16 23:03:44.418891 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:03:44.418839 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls podName:3d797686-0011-497c-84cc-fbd493049bd0 nodeName:}" failed. No retries permitted until 2026-04-16 23:03:44.918821405 +0000 UTC m=+3004.043445673 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls") pod "isvc-triton-predictor-84bb65d94b-nprnq" (UID: "3d797686-0011-497c-84cc-fbd493049bd0") : secret "isvc-triton-predictor-serving-cert" not found Apr 16 23:03:44.419095 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.419075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d797686-0011-497c-84cc-fbd493049bd0-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.419201 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.419173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d797686-0011-497c-84cc-fbd493049bd0-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.426643 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.426617 2576 generic.go:358] "Generic (PLEG): container finished" podID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerID="660d1de68f456afea078378259f84ef419d5a5aa33366d449a4af8db91b49177" exitCode=2 Apr 16 23:03:44.426773 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.426693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" event={"ID":"706b1917-bc2d-4a18-97bc-29c00eab72a6","Type":"ContainerDied","Data":"660d1de68f456afea078378259f84ef419d5a5aa33366d449a4af8db91b49177"} Apr 16 23:03:44.427032 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.427015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-2nckl\" (UniqueName: \"kubernetes.io/projected/3d797686-0011-497c-84cc-fbd493049bd0-kube-api-access-2nckl\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.923243 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.923207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:44.925624 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:44.925605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-nprnq\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:45.158306 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:45.158260 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:03:45.278719 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:45.278697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq"] Apr 16 23:03:45.281219 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:03:45.281187 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d797686_0011_497c_84cc_fbd493049bd0.slice/crio-ba4c5a313600e6864161095828e2e843d55efcb0a000e50630511bac9a5cd769 WatchSource:0}: Error finding container ba4c5a313600e6864161095828e2e843d55efcb0a000e50630511bac9a5cd769: Status 404 returned error can't find the container with id ba4c5a313600e6864161095828e2e843d55efcb0a000e50630511bac9a5cd769 Apr 16 23:03:45.431414 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:45.431379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" event={"ID":"3d797686-0011-497c-84cc-fbd493049bd0","Type":"ContainerStarted","Data":"a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045"} Apr 16 23:03:45.431574 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:45.431417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" event={"ID":"3d797686-0011-497c-84cc-fbd493049bd0","Type":"ContainerStarted","Data":"ba4c5a313600e6864161095828e2e843d55efcb0a000e50630511bac9a5cd769"} Apr 16 23:03:47.320801 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:47.320754 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.60:8643/healthz\": dial tcp 10.133.0.60:8643: connect: connection refused" Apr 16 23:03:49.445421 
ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:49.445384 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d797686-0011-497c-84cc-fbd493049bd0" containerID="a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045" exitCode=0 Apr 16 23:03:49.445797 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:49.445456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" event={"ID":"3d797686-0011-497c-84cc-fbd493049bd0","Type":"ContainerDied","Data":"a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045"} Apr 16 23:03:52.320136 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:52.320075 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.60:8643/healthz\": dial tcp 10.133.0.60:8643: connect: connection refused" Apr 16 23:03:57.320734 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:57.320140 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.60:8643/healthz\": dial tcp 10.133.0.60:8643: connect: connection refused" Apr 16 23:03:57.321500 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:03:57.321474 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:04:02.320787 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:02.320742 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.133.0.60:8643/healthz\": dial tcp 10.133.0.60:8643: connect: connection refused" Apr 16 23:04:07.320555 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:07.320411 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.60:8643/healthz\": dial tcp 10.133.0.60:8643: connect: connection refused" Apr 16 23:04:12.320695 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:12.320587 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.60:8643/healthz\": dial tcp 10.133.0.60:8643: connect: connection refused" Apr 16 23:04:14.557885 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.557844 2576 generic.go:358] "Generic (PLEG): container finished" podID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerID="cbff6af6414848f6523b924b43944df4e39d5a87adbd9f66202ed6e4e8c5fe02" exitCode=137 Apr 16 23:04:14.558400 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.557973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" event={"ID":"706b1917-bc2d-4a18-97bc-29c00eab72a6","Type":"ContainerDied","Data":"cbff6af6414848f6523b924b43944df4e39d5a87adbd9f66202ed6e4e8c5fe02"} Apr 16 23:04:14.857968 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.857941 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:04:14.894966 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.894937 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kdml\" (UniqueName: \"kubernetes.io/projected/706b1917-bc2d-4a18-97bc-29c00eab72a6-kube-api-access-2kdml\") pod \"706b1917-bc2d-4a18-97bc-29c00eab72a6\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " Apr 16 23:04:14.895143 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.895003 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706b1917-bc2d-4a18-97bc-29c00eab72a6-kserve-provision-location\") pod \"706b1917-bc2d-4a18-97bc-29c00eab72a6\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " Apr 16 23:04:14.895143 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.895059 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706b1917-bc2d-4a18-97bc-29c00eab72a6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"706b1917-bc2d-4a18-97bc-29c00eab72a6\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " Apr 16 23:04:14.895143 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.895109 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706b1917-bc2d-4a18-97bc-29c00eab72a6-proxy-tls\") pod \"706b1917-bc2d-4a18-97bc-29c00eab72a6\" (UID: \"706b1917-bc2d-4a18-97bc-29c00eab72a6\") " Apr 16 23:04:14.895594 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.895557 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706b1917-bc2d-4a18-97bc-29c00eab72a6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "706b1917-bc2d-4a18-97bc-29c00eab72a6" (UID: "706b1917-bc2d-4a18-97bc-29c00eab72a6"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:04:14.899969 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.899931 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706b1917-bc2d-4a18-97bc-29c00eab72a6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "706b1917-bc2d-4a18-97bc-29c00eab72a6" (UID: "706b1917-bc2d-4a18-97bc-29c00eab72a6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:04:14.900237 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.900197 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706b1917-bc2d-4a18-97bc-29c00eab72a6-kube-api-access-2kdml" (OuterVolumeSpecName: "kube-api-access-2kdml") pod "706b1917-bc2d-4a18-97bc-29c00eab72a6" (UID: "706b1917-bc2d-4a18-97bc-29c00eab72a6"). InnerVolumeSpecName "kube-api-access-2kdml". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:04:14.905533 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.905481 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/706b1917-bc2d-4a18-97bc-29c00eab72a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "706b1917-bc2d-4a18-97bc-29c00eab72a6" (UID: "706b1917-bc2d-4a18-97bc-29c00eab72a6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:04:14.996499 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.996458 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kdml\" (UniqueName: \"kubernetes.io/projected/706b1917-bc2d-4a18-97bc-29c00eab72a6-kube-api-access-2kdml\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:04:14.996499 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.996498 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706b1917-bc2d-4a18-97bc-29c00eab72a6-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:04:14.996751 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.996516 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706b1917-bc2d-4a18-97bc-29c00eab72a6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:04:14.996751 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:14.996533 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706b1917-bc2d-4a18-97bc-29c00eab72a6-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:04:15.564453 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:15.564401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" event={"ID":"706b1917-bc2d-4a18-97bc-29c00eab72a6","Type":"ContainerDied","Data":"14fbcb45498de76f32773fd116dc2a1738ab2f4cc5ea8e66cedb59d173c0c9aa"} Apr 16 23:04:15.564453 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:15.564460 2576 scope.go:117] "RemoveContainer" containerID="660d1de68f456afea078378259f84ef419d5a5aa33366d449a4af8db91b49177" Apr 16 23:04:15.565095 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:04:15.564480 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd" Apr 16 23:04:15.575741 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:15.575717 2576 scope.go:117] "RemoveContainer" containerID="cbff6af6414848f6523b924b43944df4e39d5a87adbd9f66202ed6e4e8c5fe02" Apr 16 23:04:15.583358 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:15.583308 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd"] Apr 16 23:04:15.585839 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:15.585813 2576 scope.go:117] "RemoveContainer" containerID="45e41a6800af7fdc09888174b7009091969ebb7d6bb954b3e9e4ee831426c000" Apr 16 23:04:15.589152 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:15.589103 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-6kldd"] Apr 16 23:04:17.455897 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:04:17.455862 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" path="/var/lib/kubelet/pods/706b1917-bc2d-4a18-97bc-29c00eab72a6/volumes" Apr 16 23:05:43.877147 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:43.877120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" event={"ID":"3d797686-0011-497c-84cc-fbd493049bd0","Type":"ContainerStarted","Data":"50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d"} Apr 16 23:05:44.882415 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:44.882373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" event={"ID":"3d797686-0011-497c-84cc-fbd493049bd0","Type":"ContainerStarted","Data":"c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b"} 
Apr 16 23:05:44.882839 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:44.882557 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:05:44.882839 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:44.882672 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:05:44.883885 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:44.883859 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 16 23:05:44.908404 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:44.908356 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" podStartSLOduration=6.634731715 podStartE2EDuration="2m0.908317683s" podCreationTimestamp="2026-04-16 23:03:44 +0000 UTC" firstStartedPulling="2026-04-16 23:03:49.446492152 +0000 UTC m=+3008.571116421" lastFinishedPulling="2026-04-16 23:05:43.72007811 +0000 UTC m=+3122.844702389" observedRunningTime="2026-04-16 23:05:44.907153751 +0000 UTC m=+3124.031778051" watchObservedRunningTime="2026-04-16 23:05:44.908317683 +0000 UTC m=+3124.032941973" Apr 16 23:05:45.885432 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:45.885393 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 16 23:05:50.889921 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:50.889895 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:05:50.890590 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:50.890572 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:05:55.818072 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.818037 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq"] Apr 16 23:05:55.818579 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.818414 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kserve-container" containerID="cri-o://50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d" gracePeriod=30 Apr 16 23:05:55.818579 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.818435 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kube-rbac-proxy" containerID="cri-o://c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b" gracePeriod=30 Apr 16 23:05:55.885146 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885110 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f"] Apr 16 23:05:55.885479 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885463 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="storage-initializer" Apr 16 23:05:55.885479 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885480 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="storage-initializer" Apr 16 23:05:55.885572 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:05:55.885492 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" Apr 16 23:05:55.885572 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885498 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" Apr 16 23:05:55.885572 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885515 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kserve-container" Apr 16 23:05:55.885572 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885523 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kserve-container" Apr 16 23:05:55.885697 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885579 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kube-rbac-proxy" Apr 16 23:05:55.885697 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885589 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="706b1917-bc2d-4a18-97bc-29c00eab72a6" containerName="kserve-container" Apr 16 23:05:55.885972 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.885945 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.61:8643/healthz\": dial tcp 10.133.0.61:8643: connect: connection refused" Apr 16 23:05:55.905392 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.905362 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f"] Apr 16 23:05:55.905547 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.905503 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:55.908065 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.908027 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 16 23:05:55.908065 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:55.908027 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 16 23:05:56.062847 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.062809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99536545-0a43-4872-ba7e-81a5ff4950f4-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.062847 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.062847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z2dh\" (UniqueName: \"kubernetes.io/projected/99536545-0a43-4872-ba7e-81a5ff4950f4-kube-api-access-7z2dh\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.063077 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.062877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99536545-0a43-4872-ba7e-81a5ff4950f4-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.063077 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.062923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99536545-0a43-4872-ba7e-81a5ff4950f4-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.164119 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.164030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99536545-0a43-4872-ba7e-81a5ff4950f4-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.164119 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.164089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99536545-0a43-4872-ba7e-81a5ff4950f4-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.164380 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.164120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z2dh\" (UniqueName: \"kubernetes.io/projected/99536545-0a43-4872-ba7e-81a5ff4950f4-kube-api-access-7z2dh\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.164380 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.164163 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99536545-0a43-4872-ba7e-81a5ff4950f4-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.164614 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.164594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99536545-0a43-4872-ba7e-81a5ff4950f4-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.164879 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.164858 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99536545-0a43-4872-ba7e-81a5ff4950f4-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.166709 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.166680 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99536545-0a43-4872-ba7e-81a5ff4950f4-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.171884 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.171856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z2dh\" (UniqueName: \"kubernetes.io/projected/99536545-0a43-4872-ba7e-81a5ff4950f4-kube-api-access-7z2dh\") pod 
\"isvc-xgboost-predictor-8689c4cfcc-67n8f\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.216001 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.215964 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:05:56.411666 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.411641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f"] Apr 16 23:05:56.413978 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:05:56.413949 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99536545_0a43_4872_ba7e_81a5ff4950f4.slice/crio-d89e481a3f1aef6c39bba79d9bd62bbf4e8b54753b485511fbf007e5827386d9 WatchSource:0}: Error finding container d89e481a3f1aef6c39bba79d9bd62bbf4e8b54753b485511fbf007e5827386d9: Status 404 returned error can't find the container with id d89e481a3f1aef6c39bba79d9bd62bbf4e8b54753b485511fbf007e5827386d9 Apr 16 23:05:56.918880 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.918845 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d797686-0011-497c-84cc-fbd493049bd0" containerID="c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b" exitCode=2 Apr 16 23:05:56.919427 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.918922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" event={"ID":"3d797686-0011-497c-84cc-fbd493049bd0","Type":"ContainerDied","Data":"c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b"} Apr 16 23:05:56.920413 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.920391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" 
event={"ID":"99536545-0a43-4872-ba7e-81a5ff4950f4","Type":"ContainerStarted","Data":"101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f"} Apr 16 23:05:56.920500 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:56.920419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" event={"ID":"99536545-0a43-4872-ba7e-81a5ff4950f4","Type":"ContainerStarted","Data":"d89e481a3f1aef6c39bba79d9bd62bbf4e8b54753b485511fbf007e5827386d9"} Apr 16 23:05:58.688998 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.688975 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:05:58.788608 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.788520 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls\") pod \"3d797686-0011-497c-84cc-fbd493049bd0\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " Apr 16 23:05:58.788608 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.788563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nckl\" (UniqueName: \"kubernetes.io/projected/3d797686-0011-497c-84cc-fbd493049bd0-kube-api-access-2nckl\") pod \"3d797686-0011-497c-84cc-fbd493049bd0\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " Apr 16 23:05:58.788608 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.788590 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d797686-0011-497c-84cc-fbd493049bd0-isvc-triton-kube-rbac-proxy-sar-config\") pod \"3d797686-0011-497c-84cc-fbd493049bd0\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " Apr 16 23:05:58.788879 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.788622 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d797686-0011-497c-84cc-fbd493049bd0-kserve-provision-location\") pod \"3d797686-0011-497c-84cc-fbd493049bd0\" (UID: \"3d797686-0011-497c-84cc-fbd493049bd0\") " Apr 16 23:05:58.788987 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.788949 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d797686-0011-497c-84cc-fbd493049bd0-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "3d797686-0011-497c-84cc-fbd493049bd0" (UID: "3d797686-0011-497c-84cc-fbd493049bd0"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:05:58.789117 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.789039 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d797686-0011-497c-84cc-fbd493049bd0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3d797686-0011-497c-84cc-fbd493049bd0" (UID: "3d797686-0011-497c-84cc-fbd493049bd0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:05:58.790682 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.790656 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3d797686-0011-497c-84cc-fbd493049bd0" (UID: "3d797686-0011-497c-84cc-fbd493049bd0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:05:58.790863 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.790846 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d797686-0011-497c-84cc-fbd493049bd0-kube-api-access-2nckl" (OuterVolumeSpecName: "kube-api-access-2nckl") pod "3d797686-0011-497c-84cc-fbd493049bd0" (UID: "3d797686-0011-497c-84cc-fbd493049bd0"). InnerVolumeSpecName "kube-api-access-2nckl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:05:58.889542 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.889504 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d797686-0011-497c-84cc-fbd493049bd0-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:05:58.889542 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.889534 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nckl\" (UniqueName: \"kubernetes.io/projected/3d797686-0011-497c-84cc-fbd493049bd0-kube-api-access-2nckl\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:05:58.889542 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.889547 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d797686-0011-497c-84cc-fbd493049bd0-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:05:58.890168 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.889557 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d797686-0011-497c-84cc-fbd493049bd0-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:05:58.930973 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.930939 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="3d797686-0011-497c-84cc-fbd493049bd0" containerID="50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d" exitCode=0 Apr 16 23:05:58.931114 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.930977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" event={"ID":"3d797686-0011-497c-84cc-fbd493049bd0","Type":"ContainerDied","Data":"50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d"} Apr 16 23:05:58.931114 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.931002 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" event={"ID":"3d797686-0011-497c-84cc-fbd493049bd0","Type":"ContainerDied","Data":"ba4c5a313600e6864161095828e2e843d55efcb0a000e50630511bac9a5cd769"} Apr 16 23:05:58.931114 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.931016 2576 scope.go:117] "RemoveContainer" containerID="c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b" Apr 16 23:05:58.931114 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.931015 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq" Apr 16 23:05:58.939057 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.939038 2576 scope.go:117] "RemoveContainer" containerID="50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d" Apr 16 23:05:58.946154 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.946134 2576 scope.go:117] "RemoveContainer" containerID="a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045" Apr 16 23:05:58.951069 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.951046 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq"] Apr 16 23:05:58.953628 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.953604 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nprnq"] Apr 16 23:05:58.953822 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.953809 2576 scope.go:117] "RemoveContainer" containerID="c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b" Apr 16 23:05:58.954088 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:05:58.954069 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b\": container with ID starting with c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b not found: ID does not exist" containerID="c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b" Apr 16 23:05:58.954134 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.954097 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b"} err="failed to get container status \"c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b\": rpc error: code = NotFound desc = could not find container 
\"c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b\": container with ID starting with c9dcdd210a2ecab3727b1f9b0bc9914c4312b6587302499b91dcbc33dd5bf95b not found: ID does not exist" Apr 16 23:05:58.954134 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.954115 2576 scope.go:117] "RemoveContainer" containerID="50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d" Apr 16 23:05:58.954350 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:05:58.954334 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d\": container with ID starting with 50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d not found: ID does not exist" containerID="50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d" Apr 16 23:05:58.954401 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.954356 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d"} err="failed to get container status \"50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d\": rpc error: code = NotFound desc = could not find container \"50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d\": container with ID starting with 50f557ff443f28b177f02b2e0650a3c8963d8fafabbdcf4ead8c583c9578533d not found: ID does not exist" Apr 16 23:05:58.954401 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.954372 2576 scope.go:117] "RemoveContainer" containerID="a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045" Apr 16 23:05:58.954554 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:05:58.954536 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045\": container with ID starting with 
a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045 not found: ID does not exist" containerID="a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045" Apr 16 23:05:58.954609 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:58.954561 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045"} err="failed to get container status \"a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045\": rpc error: code = NotFound desc = could not find container \"a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045\": container with ID starting with a1e7525653681de9f264a1bf9d150e5f380beef9125405f825e85d202bf70045 not found: ID does not exist" Apr 16 23:05:59.453349 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:05:59.453255 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d797686-0011-497c-84cc-fbd493049bd0" path="/var/lib/kubelet/pods/3d797686-0011-497c-84cc-fbd493049bd0/volumes" Apr 16 23:06:00.939532 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:00.939501 2576 generic.go:358] "Generic (PLEG): container finished" podID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerID="101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f" exitCode=0 Apr 16 23:06:00.939910 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:00.939542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" event={"ID":"99536545-0a43-4872-ba7e-81a5ff4950f4","Type":"ContainerDied","Data":"101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f"} Apr 16 23:06:21.009258 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:21.009220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" 
event={"ID":"99536545-0a43-4872-ba7e-81a5ff4950f4","Type":"ContainerStarted","Data":"cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac"} Apr 16 23:06:21.009258 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:21.009261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" event={"ID":"99536545-0a43-4872-ba7e-81a5ff4950f4","Type":"ContainerStarted","Data":"4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01"} Apr 16 23:06:21.009793 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:21.009483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:06:21.027281 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:21.027227 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podStartSLOduration=6.983915259 podStartE2EDuration="26.027210936s" podCreationTimestamp="2026-04-16 23:05:55 +0000 UTC" firstStartedPulling="2026-04-16 23:06:00.940755396 +0000 UTC m=+3140.065379662" lastFinishedPulling="2026-04-16 23:06:19.984051074 +0000 UTC m=+3159.108675339" observedRunningTime="2026-04-16 23:06:21.026380353 +0000 UTC m=+3160.151004640" watchObservedRunningTime="2026-04-16 23:06:21.027210936 +0000 UTC m=+3160.151835225" Apr 16 23:06:22.013065 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:22.013030 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:06:22.014494 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:22.014462 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 
23:06:23.016616 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:23.016580 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 23:06:28.020541 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:28.020512 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:06:28.021115 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:28.021088 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 23:06:38.021540 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:38.021501 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 23:06:48.021986 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:48.021948 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 23:06:58.021466 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:06:58.021423 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 23:07:08.021574 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:08.021531 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 23:07:18.021906 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:18.021858 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 23:07:28.022439 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:28.022409 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:07:35.988010 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:35.987928 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f"] Apr 16 23:07:35.988583 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:35.988254 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" containerID="cri-o://4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01" gracePeriod=30 Apr 16 23:07:35.988583 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:35.988303 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kube-rbac-proxy" 
containerID="cri-o://cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac" gracePeriod=30 Apr 16 23:07:36.095305 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095269 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"] Apr 16 23:07:36.095650 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095633 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="storage-initializer" Apr 16 23:07:36.095650 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095651 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="storage-initializer" Apr 16 23:07:36.095762 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095672 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kserve-container" Apr 16 23:07:36.095762 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095679 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kserve-container" Apr 16 23:07:36.095762 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095693 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kube-rbac-proxy" Apr 16 23:07:36.095762 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095699 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kube-rbac-proxy" Apr 16 23:07:36.095762 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095752 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kube-rbac-proxy" Apr 16 23:07:36.095762 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.095763 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="3d797686-0011-497c-84cc-fbd493049bd0" containerName="kserve-container" Apr 16 23:07:36.097952 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.097936 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.100219 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.100194 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 16 23:07:36.100361 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.100253 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 16 23:07:36.106944 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.106921 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"] Apr 16 23:07:36.204107 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.204064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.204282 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.204128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: 
\"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.204282 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.204148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdnp\" (UniqueName: \"kubernetes.io/projected/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kube-api-access-npdnp\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.204282 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.204177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.247376 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.247270 2576 generic.go:358] "Generic (PLEG): container finished" podID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerID="cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac" exitCode=2 Apr 16 23:07:36.247376 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.247346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" event={"ID":"99536545-0a43-4872-ba7e-81a5ff4950f4","Type":"ContainerDied","Data":"cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac"} Apr 16 23:07:36.304975 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.304929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.305176 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.305006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.305176 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.305034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npdnp\" (UniqueName: \"kubernetes.io/projected/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kube-api-access-npdnp\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.305176 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.305060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.305400 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:07:36.305180 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-serving-cert: secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found Apr 16 23:07:36.305400 
ip-10-0-133-183 kubenswrapper[2576]: E0416 23:07:36.305247 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls podName:3fcd6f35-bf5b-45ef-95d9-d531068b4c57 nodeName:}" failed. No retries permitted until 2026-04-16 23:07:36.805230739 +0000 UTC m=+3235.929855003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls") pod "isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" (UID: "3fcd6f35-bf5b-45ef-95d9-d531068b4c57") : secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found Apr 16 23:07:36.305527 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.305506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.305744 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.305724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.313376 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.313354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdnp\" (UniqueName: \"kubernetes.io/projected/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kube-api-access-npdnp\") pod 
\"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.810824 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.810788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:36.813288 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:36.813255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:37.009652 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:37.009623 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" Apr 16 23:07:37.197099 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:37.197074 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"] Apr 16 23:07:37.199425 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:07:37.199393 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fcd6f35_bf5b_45ef_95d9_d531068b4c57.slice/crio-55c08f40fd7a7efdb8df4d616d8f566e0ed463a336589ab430cdeb80d2af4f72 WatchSource:0}: Error finding container 55c08f40fd7a7efdb8df4d616d8f566e0ed463a336589ab430cdeb80d2af4f72: Status 404 returned error can't find the container with id 55c08f40fd7a7efdb8df4d616d8f566e0ed463a336589ab430cdeb80d2af4f72 Apr 16 23:07:37.252757 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:37.252731 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" event={"ID":"3fcd6f35-bf5b-45ef-95d9-d531068b4c57","Type":"ContainerStarted","Data":"55c08f40fd7a7efdb8df4d616d8f566e0ed463a336589ab430cdeb80d2af4f72"} Apr 16 23:07:38.016865 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:38.016819 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.62:8643/healthz\": dial tcp 10.133.0.62:8643: connect: connection refused" Apr 16 23:07:38.021123 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:38.021100 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: 
connection refused" Apr 16 23:07:38.256706 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:38.256668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" event={"ID":"3fcd6f35-bf5b-45ef-95d9-d531068b4c57","Type":"ContainerStarted","Data":"0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2"} Apr 16 23:07:39.736044 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.736018 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" Apr 16 23:07:39.833615 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.833511 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99536545-0a43-4872-ba7e-81a5ff4950f4-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"99536545-0a43-4872-ba7e-81a5ff4950f4\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " Apr 16 23:07:39.833615 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.833555 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99536545-0a43-4872-ba7e-81a5ff4950f4-proxy-tls\") pod \"99536545-0a43-4872-ba7e-81a5ff4950f4\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " Apr 16 23:07:39.833615 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.833602 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z2dh\" (UniqueName: \"kubernetes.io/projected/99536545-0a43-4872-ba7e-81a5ff4950f4-kube-api-access-7z2dh\") pod \"99536545-0a43-4872-ba7e-81a5ff4950f4\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") " Apr 16 23:07:39.833922 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.833695 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/99536545-0a43-4872-ba7e-81a5ff4950f4-kserve-provision-location\") pod \"99536545-0a43-4872-ba7e-81a5ff4950f4\" (UID: \"99536545-0a43-4872-ba7e-81a5ff4950f4\") "
Apr 16 23:07:39.833980 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.833949 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99536545-0a43-4872-ba7e-81a5ff4950f4-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "99536545-0a43-4872-ba7e-81a5ff4950f4" (UID: "99536545-0a43-4872-ba7e-81a5ff4950f4"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:07:39.834023 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.833996 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99536545-0a43-4872-ba7e-81a5ff4950f4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "99536545-0a43-4872-ba7e-81a5ff4950f4" (UID: "99536545-0a43-4872-ba7e-81a5ff4950f4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:07:39.835731 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.835705 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99536545-0a43-4872-ba7e-81a5ff4950f4-kube-api-access-7z2dh" (OuterVolumeSpecName: "kube-api-access-7z2dh") pod "99536545-0a43-4872-ba7e-81a5ff4950f4" (UID: "99536545-0a43-4872-ba7e-81a5ff4950f4"). InnerVolumeSpecName "kube-api-access-7z2dh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:07:39.835731 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.835726 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99536545-0a43-4872-ba7e-81a5ff4950f4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "99536545-0a43-4872-ba7e-81a5ff4950f4" (UID: "99536545-0a43-4872-ba7e-81a5ff4950f4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:07:39.934156 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.934120 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7z2dh\" (UniqueName: \"kubernetes.io/projected/99536545-0a43-4872-ba7e-81a5ff4950f4-kube-api-access-7z2dh\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:07:39.934156 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.934150 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99536545-0a43-4872-ba7e-81a5ff4950f4-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:07:39.934156 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.934161 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99536545-0a43-4872-ba7e-81a5ff4950f4-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:07:39.934432 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:39.934171 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99536545-0a43-4872-ba7e-81a5ff4950f4-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:07:40.266902 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.266868 2576 generic.go:358] "Generic (PLEG): container finished" podID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerID="4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01" exitCode=0
Apr 16 23:07:40.267072 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.266954 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f"
Apr 16 23:07:40.267072 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.266950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" event={"ID":"99536545-0a43-4872-ba7e-81a5ff4950f4","Type":"ContainerDied","Data":"4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01"}
Apr 16 23:07:40.267072 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.266992 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f" event={"ID":"99536545-0a43-4872-ba7e-81a5ff4950f4","Type":"ContainerDied","Data":"d89e481a3f1aef6c39bba79d9bd62bbf4e8b54753b485511fbf007e5827386d9"}
Apr 16 23:07:40.267072 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.267007 2576 scope.go:117] "RemoveContainer" containerID="cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac"
Apr 16 23:07:40.275002 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.274845 2576 scope.go:117] "RemoveContainer" containerID="4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01"
Apr 16 23:07:40.282050 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.282034 2576 scope.go:117] "RemoveContainer" containerID="101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f"
Apr 16 23:07:40.289515 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.289491 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f"]
Apr 16 23:07:40.289742 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.289731 2576 scope.go:117] "RemoveContainer" containerID="cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac"
Apr 16 23:07:40.290006 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:07:40.289986 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac\": container with ID starting with cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac not found: ID does not exist" containerID="cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac"
Apr 16 23:07:40.290044 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.290016 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac"} err="failed to get container status \"cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac\": rpc error: code = NotFound desc = could not find container \"cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac\": container with ID starting with cc152befa3052dcbc3c641432357b71ac7c1ef80c0dc8290c4c872e5c9a50cac not found: ID does not exist"
Apr 16 23:07:40.290044 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.290033 2576 scope.go:117] "RemoveContainer" containerID="4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01"
Apr 16 23:07:40.290258 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:07:40.290242 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01\": container with ID starting with 4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01 not found: ID does not exist" containerID="4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01"
Apr 16 23:07:40.290295 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.290266 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01"} err="failed to get container status \"4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01\": rpc error: code = NotFound desc = could not find container \"4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01\": container with ID starting with 4ec7623b3112ff69f515dc964acfd9cc16fcf1979ba35c9277c1aacb425a5c01 not found: ID does not exist"
Apr 16 23:07:40.290295 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.290281 2576 scope.go:117] "RemoveContainer" containerID="101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f"
Apr 16 23:07:40.290522 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:07:40.290506 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f\": container with ID starting with 101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f not found: ID does not exist" containerID="101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f"
Apr 16 23:07:40.290592 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.290527 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f"} err="failed to get container status \"101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f\": rpc error: code = NotFound desc = could not find container \"101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f\": container with ID starting with 101c565cba45e21c0c9dd979ea7cf5f6ec09b48dbe4aed2f7a05ceffeebfaf3f not found: ID does not exist"
Apr 16 23:07:40.293044 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:40.293025 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-67n8f"]
Apr 16 23:07:41.273259 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:41.273223 2576 generic.go:358] "Generic (PLEG): container finished" podID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerID="0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2" exitCode=0
Apr 16 23:07:41.273711 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:41.273283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" event={"ID":"3fcd6f35-bf5b-45ef-95d9-d531068b4c57","Type":"ContainerDied","Data":"0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2"}
Apr 16 23:07:41.453017 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:41.452977 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" path="/var/lib/kubelet/pods/99536545-0a43-4872-ba7e-81a5ff4950f4/volumes"
Apr 16 23:07:42.278671 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:42.278635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" event={"ID":"3fcd6f35-bf5b-45ef-95d9-d531068b4c57","Type":"ContainerStarted","Data":"69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8"}
Apr 16 23:07:42.278671 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:42.278671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" event={"ID":"3fcd6f35-bf5b-45ef-95d9-d531068b4c57","Type":"ContainerStarted","Data":"4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064"}
Apr 16 23:07:42.279097 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:42.278886 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"
Apr 16 23:07:42.297651 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:42.297601 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" podStartSLOduration=6.297587114 podStartE2EDuration="6.297587114s" podCreationTimestamp="2026-04-16 23:07:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:07:42.296447206 +0000 UTC m=+3241.421071493" watchObservedRunningTime="2026-04-16 23:07:42.297587114 +0000 UTC m=+3241.422211400"
Apr 16 23:07:43.282396 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:43.282360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"
Apr 16 23:07:49.290054 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:07:49.290026 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"
Apr 16 23:08:19.294692 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:19.294658 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"
Apr 16 23:08:26.159673 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.159634 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"]
Apr 16 23:08:26.160115 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.160044 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kserve-container" containerID="cri-o://4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064" gracePeriod=30
Apr 16 23:08:26.160173 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.160122 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kube-rbac-proxy" containerID="cri-o://69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8" gracePeriod=30
Apr 16 23:08:26.228874 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.228839 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"]
Apr 16 23:08:26.229235 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.229220 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="storage-initializer"
Apr 16 23:08:26.229280 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.229238 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="storage-initializer"
Apr 16 23:08:26.229280 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.229247 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container"
Apr 16 23:08:26.229280 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.229254 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container"
Apr 16 23:08:26.229280 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.229268 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kube-rbac-proxy"
Apr 16 23:08:26.229280 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.229277 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kube-rbac-proxy"
Apr 16 23:08:26.229454 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.229368 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kserve-container"
Apr 16 23:08:26.229454 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.229399 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="99536545-0a43-4872-ba7e-81a5ff4950f4" containerName="kube-rbac-proxy"
Apr 16 23:08:26.232620 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.232599 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.234902 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.234875 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\""
Apr 16 23:08:26.235166 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.235152 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 16 23:08:26.243750 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.243726 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"]
Apr 16 23:08:26.280504 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.280476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/295765e6-eaf7-4939-8890-c00f2d4d6cab-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.280647 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.280542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/295765e6-eaf7-4939-8890-c00f2d4d6cab-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.280647 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.280595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g7mb\" (UniqueName: \"kubernetes.io/projected/295765e6-eaf7-4939-8890-c00f2d4d6cab-kube-api-access-5g7mb\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.280647 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.280640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.381653 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.381622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g7mb\" (UniqueName: \"kubernetes.io/projected/295765e6-eaf7-4939-8890-c00f2d4d6cab-kube-api-access-5g7mb\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.381653 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.381658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.381908 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.381686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/295765e6-eaf7-4939-8890-c00f2d4d6cab-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.381908 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.381719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/295765e6-eaf7-4939-8890-c00f2d4d6cab-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.381908 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:08:26.381788 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-serving-cert: secret "xgboost-v2-mlserver-predictor-serving-cert" not found
Apr 16 23:08:26.381908 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:08:26.381865 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls podName:295765e6-eaf7-4939-8890-c00f2d4d6cab nodeName:}" failed. No retries permitted until 2026-04-16 23:08:26.881844814 +0000 UTC m=+3286.006469079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls") pod "xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" (UID: "295765e6-eaf7-4939-8890-c00f2d4d6cab") : secret "xgboost-v2-mlserver-predictor-serving-cert" not found
Apr 16 23:08:26.382096 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.382078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/295765e6-eaf7-4939-8890-c00f2d4d6cab-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.382418 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.382400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/295765e6-eaf7-4939-8890-c00f2d4d6cab-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.390616 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.390597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g7mb\" (UniqueName: \"kubernetes.io/projected/295765e6-eaf7-4939-8890-c00f2d4d6cab-kube-api-access-5g7mb\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.429388 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.429290 2576 generic.go:358] "Generic (PLEG): container finished" podID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerID="69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8" exitCode=2
Apr 16 23:08:26.429388 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.429354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" event={"ID":"3fcd6f35-bf5b-45ef-95d9-d531068b4c57","Type":"ContainerDied","Data":"69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8"}
Apr 16 23:08:26.886303 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.886268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:26.888595 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:26.888564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-5ftr9\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:27.144377 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:27.144251 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:27.265527 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:27.265498 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"]
Apr 16 23:08:27.268015 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:08:27.267990 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod295765e6_eaf7_4939_8890_c00f2d4d6cab.slice/crio-9ac465ad1b136495897681c86b9e60de271ad1b570fcc209aecc61a4d29b6986 WatchSource:0}: Error finding container 9ac465ad1b136495897681c86b9e60de271ad1b570fcc209aecc61a4d29b6986: Status 404 returned error can't find the container with id 9ac465ad1b136495897681c86b9e60de271ad1b570fcc209aecc61a4d29b6986
Apr 16 23:08:27.270360 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:27.270345 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 23:08:27.434638 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:27.434553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" event={"ID":"295765e6-eaf7-4939-8890-c00f2d4d6cab","Type":"ContainerStarted","Data":"9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238"}
Apr 16 23:08:27.434638 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:27.434589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" event={"ID":"295765e6-eaf7-4939-8890-c00f2d4d6cab","Type":"ContainerStarted","Data":"9ac465ad1b136495897681c86b9e60de271ad1b570fcc209aecc61a4d29b6986"}
Apr 16 23:08:29.286454 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:29.286407 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.63:8643/healthz\": dial tcp 10.133.0.63:8643: connect: connection refused"
Apr 16 23:08:29.291351 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:29.291305 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.63:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 23:08:31.450783 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:31.450741 2576 generic.go:358] "Generic (PLEG): container finished" podID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerID="9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238" exitCode=0
Apr 16 23:08:31.452658 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:31.452636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" event={"ID":"295765e6-eaf7-4939-8890-c00f2d4d6cab","Type":"ContainerDied","Data":"9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238"}
Apr 16 23:08:32.455486 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.455451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" event={"ID":"295765e6-eaf7-4939-8890-c00f2d4d6cab","Type":"ContainerStarted","Data":"1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810"}
Apr 16 23:08:32.455486 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.455489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" event={"ID":"295765e6-eaf7-4939-8890-c00f2d4d6cab","Type":"ContainerStarted","Data":"23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32"}
Apr 16 23:08:32.455968 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.455717 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:32.455968 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.455849 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"
Apr 16 23:08:32.475519 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.475471 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" podStartSLOduration=6.475458819 podStartE2EDuration="6.475458819s" podCreationTimestamp="2026-04-16 23:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:08:32.473994933 +0000 UTC m=+3291.598619254" watchObservedRunningTime="2026-04-16 23:08:32.475458819 +0000 UTC m=+3291.600083151"
Apr 16 23:08:32.806648 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.806624 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"
Apr 16 23:08:32.937097 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.937048 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") "
Apr 16 23:08:32.937310 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.937147 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npdnp\" (UniqueName: \"kubernetes.io/projected/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kube-api-access-npdnp\") pod \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") "
Apr 16 23:08:32.937310 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.937176 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kserve-provision-location\") pod \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") "
Apr 16 23:08:32.937310 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.937227 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls\") pod \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\" (UID: \"3fcd6f35-bf5b-45ef-95d9-d531068b4c57\") "
Apr 16 23:08:32.937604 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.937573 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "3fcd6f35-bf5b-45ef-95d9-d531068b4c57" (UID: "3fcd6f35-bf5b-45ef-95d9-d531068b4c57"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:08:32.944979 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.937605 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3fcd6f35-bf5b-45ef-95d9-d531068b4c57" (UID: "3fcd6f35-bf5b-45ef-95d9-d531068b4c57"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:08:32.944979 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.944454 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3fcd6f35-bf5b-45ef-95d9-d531068b4c57" (UID: "3fcd6f35-bf5b-45ef-95d9-d531068b4c57"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:08:32.944979 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:32.944453 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kube-api-access-npdnp" (OuterVolumeSpecName: "kube-api-access-npdnp") pod "3fcd6f35-bf5b-45ef-95d9-d531068b4c57" (UID: "3fcd6f35-bf5b-45ef-95d9-d531068b4c57"). InnerVolumeSpecName "kube-api-access-npdnp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:08:33.037869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.037778 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:08:33.037869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.037813 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-npdnp\" (UniqueName: \"kubernetes.io/projected/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kube-api-access-npdnp\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:08:33.037869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.037824 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:08:33.037869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.037833 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcd6f35-bf5b-45ef-95d9-d531068b4c57-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:08:33.459693 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.459656 2576 generic.go:358] "Generic (PLEG): container finished" podID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerID="4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064" exitCode=0
Apr 16 23:08:33.460122 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.459787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" event={"ID":"3fcd6f35-bf5b-45ef-95d9-d531068b4c57","Type":"ContainerDied","Data":"4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064"}
Apr 16 23:08:33.460122 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.459827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw" event={"ID":"3fcd6f35-bf5b-45ef-95d9-d531068b4c57","Type":"ContainerDied","Data":"55c08f40fd7a7efdb8df4d616d8f566e0ed463a336589ab430cdeb80d2af4f72"}
Apr 16 23:08:33.460122 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.459829 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"
Apr 16 23:08:33.460122 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.459844 2576 scope.go:117] "RemoveContainer" containerID="69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8"
Apr 16 23:08:33.468025 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.468010 2576 scope.go:117] "RemoveContainer" containerID="4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064"
Apr 16 23:08:33.474635 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.474617 2576 scope.go:117] "RemoveContainer" containerID="0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2"
Apr 16 23:08:33.478892 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.478868 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"]
Apr 16 23:08:33.481463 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.481442 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-rrdgw"]
Apr 16 23:08:33.482315 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.482297 2576 scope.go:117] "RemoveContainer" containerID="69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8"
Apr 16 23:08:33.482569 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:08:33.482551 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8\": container with ID starting with 69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8 not found: ID does not exist" containerID="69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8"
Apr 16 23:08:33.482635 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.482575 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8"} err="failed to get container status \"69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8\": rpc error: code = NotFound desc = could not find container \"69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8\": container with ID starting with 69386cb0320c6c6561a6b2e180070165c73307a7bbb5111f49f31b4039d3a4f8 not found: ID does not exist"
Apr 16 23:08:33.482635 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.482596 2576 scope.go:117] "RemoveContainer" containerID="4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064"
Apr 16 23:08:33.482830 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:08:33.482811 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064\": container with ID starting with 4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064 not found: ID does not exist" containerID="4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064"
Apr 16 23:08:33.482872 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.482839 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064"} err="failed to get container status \"4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064\": rpc error: code = NotFound desc = could not find container \"4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064\": container with ID starting with 4431a93491925872bcfa38899da66b3de7dd68fec0bfd0e00babc99b43434064 not found: ID does not exist"
Apr 16 23:08:33.482872 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.482856 2576 scope.go:117] "RemoveContainer" containerID="0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2"
Apr 16 23:08:33.483074 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:08:33.483059 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2\": container with ID starting with 0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2 not found: ID does not exist" containerID="0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2"
Apr 16 23:08:33.483115 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:33.483077 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2"} err="failed to get container status \"0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2\": rpc error: code = NotFound desc = could not find container \"0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2\": container with ID starting with 0311579a5263427303658701e2fc0da3c8080bab1e64b1c5e02a65d569b4dfc2 not found: ID does not exist"
Apr 16 23:08:35.453559 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:35.453526 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57"
path="/var/lib/kubelet/pods/3fcd6f35-bf5b-45ef-95d9-d531068b4c57/volumes" Apr 16 23:08:38.465845 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:08:38.465815 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" Apr 16 23:09:08.470262 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:08.470234 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" Apr 16 23:09:16.365608 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365569 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9"] Apr 16 23:09:16.367899 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365894 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kserve-container" Apr 16 23:09:16.367899 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365905 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kserve-container" Apr 16 23:09:16.367899 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365916 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kube-rbac-proxy" Apr 16 23:09:16.367899 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365922 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kube-rbac-proxy" Apr 16 23:09:16.367899 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365933 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="storage-initializer" Apr 16 23:09:16.367899 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365939 2576 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="storage-initializer" Apr 16 23:09:16.367899 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365980 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kserve-container" Apr 16 23:09:16.367899 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.365987 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fcd6f35-bf5b-45ef-95d9-d531068b4c57" containerName="kube-rbac-proxy" Apr 16 23:09:16.368957 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.368941 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.372404 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.372372 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 16 23:09:16.372538 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.372421 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 16 23:09:16.377388 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.377363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.377568 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.377549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlk4m\" (UniqueName: 
\"kubernetes.io/projected/48fcafd2-d431-4506-bf2e-7c7390734d65-kube-api-access-zlk4m\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.377698 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.377676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48fcafd2-d431-4506-bf2e-7c7390734d65-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.377762 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.377749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48fcafd2-d431-4506-bf2e-7c7390734d65-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.382165 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.382132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9"] Apr 16 23:09:16.415212 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.415182 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"] Apr 16 23:09:16.415559 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.415515 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" 
containerName="kserve-container" containerID="cri-o://23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32" gracePeriod=30 Apr 16 23:09:16.415692 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.415568 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="kube-rbac-proxy" containerID="cri-o://1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810" gracePeriod=30 Apr 16 23:09:16.478570 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.478524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48fcafd2-d431-4506-bf2e-7c7390734d65-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.478748 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.478600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48fcafd2-d431-4506-bf2e-7c7390734d65-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.478748 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.478650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 
23:09:16.478748 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.478674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlk4m\" (UniqueName: \"kubernetes.io/projected/48fcafd2-d431-4506-bf2e-7c7390734d65-kube-api-access-zlk4m\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.478937 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:09:16.478807 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-serving-cert: secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 16 23:09:16.478937 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:09:16.478874 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls podName:48fcafd2-d431-4506-bf2e-7c7390734d65 nodeName:}" failed. No retries permitted until 2026-04-16 23:09:16.978860021 +0000 UTC m=+3336.103484291 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls") pod "isvc-xgboost-runtime-predictor-779db84d9-z2km9" (UID: "48fcafd2-d431-4506-bf2e-7c7390734d65") : secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 16 23:09:16.478937 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.478886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48fcafd2-d431-4506-bf2e-7c7390734d65-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.479185 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.479166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48fcafd2-d431-4506-bf2e-7c7390734d65-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.488173 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.488151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlk4m\" (UniqueName: \"kubernetes.io/projected/48fcafd2-d431-4506-bf2e-7c7390734d65-kube-api-access-zlk4m\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.589838 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.589807 2576 generic.go:358] "Generic (PLEG): container finished" podID="295765e6-eaf7-4939-8890-c00f2d4d6cab" 
containerID="1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810" exitCode=2 Apr 16 23:09:16.590000 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.589844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" event={"ID":"295765e6-eaf7-4939-8890-c00f2d4d6cab","Type":"ContainerDied","Data":"1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810"} Apr 16 23:09:16.982863 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.982824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:16.985351 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:16.985299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-z2km9\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:17.280984 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:17.280895 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:17.402605 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:17.402579 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9"] Apr 16 23:09:17.405281 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:09:17.405253 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fcafd2_d431_4506_bf2e_7c7390734d65.slice/crio-41ba06006738cdaa3e93222e8d82d1852275810462273a1972531166dec89224 WatchSource:0}: Error finding container 41ba06006738cdaa3e93222e8d82d1852275810462273a1972531166dec89224: Status 404 returned error can't find the container with id 41ba06006738cdaa3e93222e8d82d1852275810462273a1972531166dec89224 Apr 16 23:09:17.594423 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:17.594310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" event={"ID":"48fcafd2-d431-4506-bf2e-7c7390734d65","Type":"ContainerStarted","Data":"276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084"} Apr 16 23:09:17.594423 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:17.594370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" event={"ID":"48fcafd2-d431-4506-bf2e-7c7390734d65","Type":"ContainerStarted","Data":"41ba06006738cdaa3e93222e8d82d1852275810462273a1972531166dec89224"} Apr 16 23:09:18.460738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:18.460699 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.64:8643/healthz\": dial tcp 10.133.0.64:8643: connect: connection 
refused" Apr 16 23:09:21.607525 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:21.607491 2576 generic.go:358] "Generic (PLEG): container finished" podID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerID="276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084" exitCode=0 Apr 16 23:09:21.607998 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:21.607567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" event={"ID":"48fcafd2-d431-4506-bf2e-7c7390734d65","Type":"ContainerDied","Data":"276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084"} Apr 16 23:09:22.615556 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:22.615523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" event={"ID":"48fcafd2-d431-4506-bf2e-7c7390734d65","Type":"ContainerStarted","Data":"4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7"} Apr 16 23:09:22.615999 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:22.615565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" event={"ID":"48fcafd2-d431-4506-bf2e-7c7390734d65","Type":"ContainerStarted","Data":"9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9"} Apr 16 23:09:22.615999 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:22.615854 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:22.615999 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:22.615878 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:22.617242 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:22.617214 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:09:22.636321 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:22.636267 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podStartSLOduration=6.636251587 podStartE2EDuration="6.636251587s" podCreationTimestamp="2026-04-16 23:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:09:22.634665795 +0000 UTC m=+3341.759290084" watchObservedRunningTime="2026-04-16 23:09:22.636251587 +0000 UTC m=+3341.760875875" Apr 16 23:09:23.058342 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.058307 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" Apr 16 23:09:23.130718 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.130683 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls\") pod \"295765e6-eaf7-4939-8890-c00f2d4d6cab\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " Apr 16 23:09:23.130718 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.130723 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/295765e6-eaf7-4939-8890-c00f2d4d6cab-kserve-provision-location\") pod \"295765e6-eaf7-4939-8890-c00f2d4d6cab\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " Apr 16 23:09:23.130983 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.130752 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-5g7mb\" (UniqueName: \"kubernetes.io/projected/295765e6-eaf7-4939-8890-c00f2d4d6cab-kube-api-access-5g7mb\") pod \"295765e6-eaf7-4939-8890-c00f2d4d6cab\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " Apr 16 23:09:23.130983 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.130832 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/295765e6-eaf7-4939-8890-c00f2d4d6cab-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"295765e6-eaf7-4939-8890-c00f2d4d6cab\" (UID: \"295765e6-eaf7-4939-8890-c00f2d4d6cab\") " Apr 16 23:09:23.131120 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.131087 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295765e6-eaf7-4939-8890-c00f2d4d6cab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "295765e6-eaf7-4939-8890-c00f2d4d6cab" (UID: "295765e6-eaf7-4939-8890-c00f2d4d6cab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:09:23.131170 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.131096 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295765e6-eaf7-4939-8890-c00f2d4d6cab-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "295765e6-eaf7-4939-8890-c00f2d4d6cab" (UID: "295765e6-eaf7-4939-8890-c00f2d4d6cab"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:09:23.132925 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.132903 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295765e6-eaf7-4939-8890-c00f2d4d6cab-kube-api-access-5g7mb" (OuterVolumeSpecName: "kube-api-access-5g7mb") pod "295765e6-eaf7-4939-8890-c00f2d4d6cab" (UID: "295765e6-eaf7-4939-8890-c00f2d4d6cab"). InnerVolumeSpecName "kube-api-access-5g7mb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:09:23.133017 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.132933 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "295765e6-eaf7-4939-8890-c00f2d4d6cab" (UID: "295765e6-eaf7-4939-8890-c00f2d4d6cab"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:09:23.232055 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.232029 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/295765e6-eaf7-4939-8890-c00f2d4d6cab-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:09:23.232055 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.232054 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/295765e6-eaf7-4939-8890-c00f2d4d6cab-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:09:23.232226 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.232064 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5g7mb\" (UniqueName: \"kubernetes.io/projected/295765e6-eaf7-4939-8890-c00f2d4d6cab-kube-api-access-5g7mb\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:09:23.232226 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 23:09:23.232076 2576 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/295765e6-eaf7-4939-8890-c00f2d4d6cab-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:09:23.620163 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.620064 2576 generic.go:358] "Generic (PLEG): container finished" podID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerID="23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32" exitCode=0 Apr 16 23:09:23.620163 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.620157 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" Apr 16 23:09:23.620717 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.620154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" event={"ID":"295765e6-eaf7-4939-8890-c00f2d4d6cab","Type":"ContainerDied","Data":"23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32"} Apr 16 23:09:23.620717 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.620266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9" event={"ID":"295765e6-eaf7-4939-8890-c00f2d4d6cab","Type":"ContainerDied","Data":"9ac465ad1b136495897681c86b9e60de271ad1b570fcc209aecc61a4d29b6986"} Apr 16 23:09:23.620717 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.620282 2576 scope.go:117] "RemoveContainer" containerID="1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810" Apr 16 23:09:23.620901 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.620871 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:09:23.628041 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.628024 2576 scope.go:117] "RemoveContainer" containerID="23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32" Apr 16 23:09:23.634948 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.634931 2576 scope.go:117] "RemoveContainer" containerID="9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238" Apr 16 23:09:23.641999 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.641973 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"] Apr 16 23:09:23.647715 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.647692 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-5ftr9"] Apr 16 23:09:23.648825 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.648808 2576 scope.go:117] "RemoveContainer" containerID="1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810" Apr 16 23:09:23.649110 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:09:23.649085 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810\": container with ID starting with 1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810 not found: ID does not exist" containerID="1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810" Apr 16 23:09:23.649219 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.649122 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810"} err="failed to get container status \"1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810\": rpc error: code = NotFound 
desc = could not find container \"1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810\": container with ID starting with 1982b82a4b83e048985656fea5abc6f634a0914f6b7596efad669918d6227810 not found: ID does not exist" Apr 16 23:09:23.649219 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.649152 2576 scope.go:117] "RemoveContainer" containerID="23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32" Apr 16 23:09:23.649405 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:09:23.649383 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32\": container with ID starting with 23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32 not found: ID does not exist" containerID="23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32" Apr 16 23:09:23.649482 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.649416 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32"} err="failed to get container status \"23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32\": rpc error: code = NotFound desc = could not find container \"23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32\": container with ID starting with 23b783016416db296205f489cbd3f6863e4b35880e58a585923431c86877ad32 not found: ID does not exist" Apr 16 23:09:23.649482 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.649437 2576 scope.go:117] "RemoveContainer" containerID="9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238" Apr 16 23:09:23.649639 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:09:23.649622 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238\": 
container with ID starting with 9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238 not found: ID does not exist" containerID="9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238" Apr 16 23:09:23.649694 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:23.649645 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238"} err="failed to get container status \"9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238\": rpc error: code = NotFound desc = could not find container \"9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238\": container with ID starting with 9c9df1a0731c115f6e97d12621fe917022fdaca8f3511e86f9c9950d70c16238 not found: ID does not exist" Apr 16 23:09:25.452968 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:25.452932 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" path="/var/lib/kubelet/pods/295765e6-eaf7-4939-8890-c00f2d4d6cab/volumes" Apr 16 23:09:28.624193 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:28.624165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:09:28.624781 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:28.624755 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:09:38.624682 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:38.624641 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:09:48.625414 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:48.625373 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:09:58.624708 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:09:58.624665 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:10:08.625044 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:08.625003 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:10:18.625035 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:18.624992 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:10:28.626086 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:28.626055 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:10:36.469345 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.469244 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9"] Apr 16 23:10:36.469813 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.469631 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" containerID="cri-o://9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9" gracePeriod=30 Apr 16 23:10:36.469813 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.469661 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kube-rbac-proxy" containerID="cri-o://4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7" gracePeriod=30 Apr 16 23:10:36.545408 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545365 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s"] Apr 16 23:10:36.545768 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545749 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="kserve-container" Apr 16 23:10:36.545888 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545770 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="kserve-container" Apr 16 23:10:36.545888 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545785 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="storage-initializer" Apr 16 23:10:36.545888 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545794 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" 
containerName="storage-initializer" Apr 16 23:10:36.545888 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545804 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="kube-rbac-proxy" Apr 16 23:10:36.545888 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545812 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="kube-rbac-proxy" Apr 16 23:10:36.546155 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545891 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="kube-rbac-proxy" Apr 16 23:10:36.546155 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.545907 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="295765e6-eaf7-4939-8890-c00f2d4d6cab" containerName="kserve-container" Apr 16 23:10:36.549421 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.549395 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.551693 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.551673 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 16 23:10:36.551693 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.551681 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 16 23:10:36.559525 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.559502 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s"] Apr 16 23:10:36.608072 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.608029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d771fcdd-a2f6-432c-bee9-7964548b8f41-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.608235 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.608078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d771fcdd-a2f6-432c-bee9-7964548b8f41-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.608235 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.608183 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.608235 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.608222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhpb4\" (UniqueName: \"kubernetes.io/projected/d771fcdd-a2f6-432c-bee9-7964548b8f41-kube-api-access-fhpb4\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.709194 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.709153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.709194 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.709196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhpb4\" (UniqueName: \"kubernetes.io/projected/d771fcdd-a2f6-432c-bee9-7964548b8f41-kube-api-access-fhpb4\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.709476 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:10:36.709349 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-serving-cert: secret 
"isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 16 23:10:36.709476 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:10:36.709428 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls podName:d771fcdd-a2f6-432c-bee9-7964548b8f41 nodeName:}" failed. No retries permitted until 2026-04-16 23:10:37.209406774 +0000 UTC m=+3416.334031043 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls") pod "isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" (UID: "d771fcdd-a2f6-432c-bee9-7964548b8f41") : secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 16 23:10:36.709476 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.709348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d771fcdd-a2f6-432c-bee9-7964548b8f41-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.709595 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.709509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d771fcdd-a2f6-432c-bee9-7964548b8f41-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.709804 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.709786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d771fcdd-a2f6-432c-bee9-7964548b8f41-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.710090 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.710071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d771fcdd-a2f6-432c-bee9-7964548b8f41-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.718170 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.718146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhpb4\" (UniqueName: \"kubernetes.io/projected/d771fcdd-a2f6-432c-bee9-7964548b8f41-kube-api-access-fhpb4\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:36.852876 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.852767 2576 generic.go:358] "Generic (PLEG): container finished" podID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerID="4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7" exitCode=2 Apr 16 23:10:36.852876 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:36.852832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" event={"ID":"48fcafd2-d431-4506-bf2e-7c7390734d65","Type":"ContainerDied","Data":"4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7"} Apr 16 23:10:37.212746 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:37.212712 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:37.215110 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:37.215081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:37.460663 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:37.460631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:37.582697 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:37.582655 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s"] Apr 16 23:10:37.586837 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:10:37.586805 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd771fcdd_a2f6_432c_bee9_7964548b8f41.slice/crio-356a4fe10f40651416360ada3144faf31f838deaf8051844c3fc611f68088a26 WatchSource:0}: Error finding container 356a4fe10f40651416360ada3144faf31f838deaf8051844c3fc611f68088a26: Status 404 returned error can't find the container with id 356a4fe10f40651416360ada3144faf31f838deaf8051844c3fc611f68088a26 Apr 16 23:10:37.858109 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:37.858010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" event={"ID":"d771fcdd-a2f6-432c-bee9-7964548b8f41","Type":"ContainerStarted","Data":"50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb"} Apr 16 23:10:37.858109 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:37.858047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" event={"ID":"d771fcdd-a2f6-432c-bee9-7964548b8f41","Type":"ContainerStarted","Data":"356a4fe10f40651416360ada3144faf31f838deaf8051844c3fc611f68088a26"} Apr 16 23:10:38.620558 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:38.620522 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.65:8643/healthz\": dial tcp 10.133.0.65:8643: connect: connection refused" Apr 16 23:10:38.624804 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:38.624783 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 23:10:40.017441 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.017413 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:10:40.137869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.137778 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls\") pod \"48fcafd2-d431-4506-bf2e-7c7390734d65\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " Apr 16 23:10:40.137869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.137849 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48fcafd2-d431-4506-bf2e-7c7390734d65-kserve-provision-location\") pod \"48fcafd2-d431-4506-bf2e-7c7390734d65\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " Apr 16 23:10:40.138107 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.137871 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48fcafd2-d431-4506-bf2e-7c7390734d65-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"48fcafd2-d431-4506-bf2e-7c7390734d65\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " Apr 16 23:10:40.138107 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.137908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlk4m\" (UniqueName: \"kubernetes.io/projected/48fcafd2-d431-4506-bf2e-7c7390734d65-kube-api-access-zlk4m\") pod \"48fcafd2-d431-4506-bf2e-7c7390734d65\" (UID: \"48fcafd2-d431-4506-bf2e-7c7390734d65\") " Apr 16 23:10:40.138227 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.138207 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48fcafd2-d431-4506-bf2e-7c7390734d65-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"48fcafd2-d431-4506-bf2e-7c7390734d65" (UID: "48fcafd2-d431-4506-bf2e-7c7390734d65"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:10:40.138272 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.138204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48fcafd2-d431-4506-bf2e-7c7390734d65-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "48fcafd2-d431-4506-bf2e-7c7390734d65" (UID: "48fcafd2-d431-4506-bf2e-7c7390734d65"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:10:40.140095 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.140074 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "48fcafd2-d431-4506-bf2e-7c7390734d65" (UID: "48fcafd2-d431-4506-bf2e-7c7390734d65"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:10:40.140174 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.140119 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fcafd2-d431-4506-bf2e-7c7390734d65-kube-api-access-zlk4m" (OuterVolumeSpecName: "kube-api-access-zlk4m") pod "48fcafd2-d431-4506-bf2e-7c7390734d65" (UID: "48fcafd2-d431-4506-bf2e-7c7390734d65"). InnerVolumeSpecName "kube-api-access-zlk4m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:10:40.238573 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.238540 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fcafd2-d431-4506-bf2e-7c7390734d65-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:10:40.238573 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.238568 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48fcafd2-d431-4506-bf2e-7c7390734d65-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:10:40.238573 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.238578 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/48fcafd2-d431-4506-bf2e-7c7390734d65-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:10:40.238792 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.238589 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zlk4m\" (UniqueName: \"kubernetes.io/projected/48fcafd2-d431-4506-bf2e-7c7390734d65-kube-api-access-zlk4m\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:10:40.869699 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.869665 2576 generic.go:358] "Generic (PLEG): container finished" podID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerID="9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9" exitCode=0 Apr 16 23:10:40.869912 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.869744 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" Apr 16 23:10:40.869912 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.869752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" event={"ID":"48fcafd2-d431-4506-bf2e-7c7390734d65","Type":"ContainerDied","Data":"9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9"} Apr 16 23:10:40.869912 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.869799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9" event={"ID":"48fcafd2-d431-4506-bf2e-7c7390734d65","Type":"ContainerDied","Data":"41ba06006738cdaa3e93222e8d82d1852275810462273a1972531166dec89224"} Apr 16 23:10:40.869912 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.869822 2576 scope.go:117] "RemoveContainer" containerID="4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7" Apr 16 23:10:40.879992 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.879974 2576 scope.go:117] "RemoveContainer" containerID="9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9" Apr 16 23:10:40.886975 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.886843 2576 scope.go:117] "RemoveContainer" containerID="276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084" Apr 16 23:10:40.893089 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.893064 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9"] Apr 16 23:10:40.893571 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.893558 2576 scope.go:117] "RemoveContainer" containerID="4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7" Apr 16 23:10:40.893798 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:10:40.893783 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7\": container with ID starting with 4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7 not found: ID does not exist" containerID="4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7" Apr 16 23:10:40.893837 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.893809 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7"} err="failed to get container status \"4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7\": rpc error: code = NotFound desc = could not find container \"4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7\": container with ID starting with 4b0a7a4315cdeeb7145b5ad6a3d8598163be60951344ac43be45fd269c6721b7 not found: ID does not exist" Apr 16 23:10:40.893837 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.893833 2576 scope.go:117] "RemoveContainer" containerID="9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9" Apr 16 23:10:40.894050 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:10:40.894032 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9\": container with ID starting with 9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9 not found: ID does not exist" containerID="9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9" Apr 16 23:10:40.894346 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.894055 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9"} err="failed to get container status \"9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9\": rpc error: code = NotFound desc = could not find 
container \"9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9\": container with ID starting with 9756999f32d26de16d52300ca5c9c7dd4e9585e73ff7dfcfb14b9e283e7a5eb9 not found: ID does not exist" Apr 16 23:10:40.894346 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.894075 2576 scope.go:117] "RemoveContainer" containerID="276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084" Apr 16 23:10:40.894492 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:10:40.894370 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084\": container with ID starting with 276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084 not found: ID does not exist" containerID="276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084" Apr 16 23:10:40.894492 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.894397 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084"} err="failed to get container status \"276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084\": rpc error: code = NotFound desc = could not find container \"276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084\": container with ID starting with 276b4cc56410b8b1fb993d4fe44ffa15445300c33f3158ff3fb8c480afe92084 not found: ID does not exist" Apr 16 23:10:40.895930 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:40.895912 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-z2km9"] Apr 16 23:10:41.454452 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:41.454414 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" path="/var/lib/kubelet/pods/48fcafd2-d431-4506-bf2e-7c7390734d65/volumes" Apr 16 
23:10:41.874819 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:41.874785 2576 generic.go:358] "Generic (PLEG): container finished" podID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerID="50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb" exitCode=0 Apr 16 23:10:41.875031 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:41.874853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" event={"ID":"d771fcdd-a2f6-432c-bee9-7964548b8f41","Type":"ContainerDied","Data":"50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb"} Apr 16 23:10:42.880272 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:42.880237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" event={"ID":"d771fcdd-a2f6-432c-bee9-7964548b8f41","Type":"ContainerStarted","Data":"cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e"} Apr 16 23:10:42.880272 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:42.880278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" event={"ID":"d771fcdd-a2f6-432c-bee9-7964548b8f41","Type":"ContainerStarted","Data":"b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3"} Apr 16 23:10:42.880752 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:42.880521 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:42.880752 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:42.880589 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:10:42.898533 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:42.898494 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" podStartSLOduration=6.898480768 podStartE2EDuration="6.898480768s" podCreationTimestamp="2026-04-16 23:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:10:42.89708606 +0000 UTC m=+3422.021710348" watchObservedRunningTime="2026-04-16 23:10:42.898480768 +0000 UTC m=+3422.023105054" Apr 16 23:10:48.890227 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:10:48.890194 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:11:18.937192 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:18.937147 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 23:11:28.893092 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:28.893060 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:11:36.741360 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.741173 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"] Apr 16 23:11:36.741822 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.741773 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="storage-initializer" Apr 16 23:11:36.741822 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.741792 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="storage-initializer" Apr 16 23:11:36.741822 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:11:36.741819 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kube-rbac-proxy" Apr 16 23:11:36.742033 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.741829 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kube-rbac-proxy" Apr 16 23:11:36.742033 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.741845 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" Apr 16 23:11:36.742033 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.741854 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" Apr 16 23:11:36.742033 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.741946 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kserve-container" Apr 16 23:11:36.742033 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.741964 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="48fcafd2-d431-4506-bf2e-7c7390734d65" containerName="kube-rbac-proxy" Apr 16 23:11:36.745133 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.745110 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.747440 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.747420 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 16 23:11:36.747557 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.747446 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 16 23:11:36.754912 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.754888 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"] Apr 16 23:11:36.764051 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.764026 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s"] Apr 16 23:11:36.764416 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.764317 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kserve-container" containerID="cri-o://b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3" gracePeriod=30 Apr 16 23:11:36.764513 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.764411 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kube-rbac-proxy" containerID="cri-o://cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e" gracePeriod=30 Apr 16 23:11:36.793237 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.793203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.793424 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.793249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.793424 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.793376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89jwb\" (UniqueName: \"kubernetes.io/projected/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kube-api-access-89jwb\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.793424 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.793417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.894455 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.894426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.894590 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.894472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.894590 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.894517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89jwb\" (UniqueName: \"kubernetes.io/projected/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kube-api-access-89jwb\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.894590 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.894544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.894726 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:11:36.894653 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 16 23:11:36.894726 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:11:36.894707 2576 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls podName:49a9a0a8-4a30-44b1-abb7-eeadffcc2922 nodeName:}" failed. No retries permitted until 2026-04-16 23:11:37.394691951 +0000 UTC m=+3476.519316216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" (UID: "49a9a0a8-4a30-44b1-abb7-eeadffcc2922") : secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 16 23:11:36.894901 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.894882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.895100 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.895081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:36.902540 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:36.902520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89jwb\" (UniqueName: \"kubernetes.io/projected/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kube-api-access-89jwb\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:37.046462 
ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:37.046375 2576 generic.go:358] "Generic (PLEG): container finished" podID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerID="cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e" exitCode=2 Apr 16 23:11:37.046462 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:37.046415 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" event={"ID":"d771fcdd-a2f6-432c-bee9-7964548b8f41","Type":"ContainerDied","Data":"cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e"} Apr 16 23:11:37.398813 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:37.398706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:37.401102 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:37.401076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-xctlh\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:37.656821 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:37.656720 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:37.779472 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:37.779307 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"] Apr 16 23:11:37.782505 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:11:37.782474 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a9a0a8_4a30_44b1_abb7_eeadffcc2922.slice/crio-ab881f38eca7961598545eb11d8b34fd51819315dfa6a257afe0935d714c95d4 WatchSource:0}: Error finding container ab881f38eca7961598545eb11d8b34fd51819315dfa6a257afe0935d714c95d4: Status 404 returned error can't find the container with id ab881f38eca7961598545eb11d8b34fd51819315dfa6a257afe0935d714c95d4 Apr 16 23:11:38.051131 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:38.051096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" event={"ID":"49a9a0a8-4a30-44b1-abb7-eeadffcc2922","Type":"ContainerStarted","Data":"d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73"} Apr 16 23:11:38.051131 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:38.051133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" event={"ID":"49a9a0a8-4a30-44b1-abb7-eeadffcc2922","Type":"ContainerStarted","Data":"ab881f38eca7961598545eb11d8b34fd51819315dfa6a257afe0935d714c95d4"} Apr 16 23:11:38.885784 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:38.885736 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.66:8643/healthz\": dial tcp 10.133.0.66:8643: connect: connection refused" Apr 16 
23:11:39.932516 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:39.932469 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.66:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 16 23:11:42.065147 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:42.065110 2576 generic.go:358] "Generic (PLEG): container finished" podID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerID="d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73" exitCode=0 Apr 16 23:11:42.065605 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:42.065174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" event={"ID":"49a9a0a8-4a30-44b1-abb7-eeadffcc2922","Type":"ContainerDied","Data":"d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73"} Apr 16 23:11:43.069976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:43.069943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" event={"ID":"49a9a0a8-4a30-44b1-abb7-eeadffcc2922","Type":"ContainerStarted","Data":"4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a"} Apr 16 23:11:43.069976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:43.069979 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" event={"ID":"49a9a0a8-4a30-44b1-abb7-eeadffcc2922","Type":"ContainerStarted","Data":"e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331"} Apr 16 23:11:43.070416 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:43.070166 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:43.089409 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:43.089361 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podStartSLOduration=7.089339991 podStartE2EDuration="7.089339991s" podCreationTimestamp="2026-04-16 23:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:11:43.088962104 +0000 UTC m=+3482.213586391" watchObservedRunningTime="2026-04-16 23:11:43.089339991 +0000 UTC m=+3482.213964276" Apr 16 23:11:43.885030 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:43.884982 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.66:8643/healthz\": dial tcp 10.133.0.66:8643: connect: connection refused" Apr 16 23:11:44.073723 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.073686 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:44.074978 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.074951 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 23:11:44.510729 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.510708 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:11:44.560739 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.560714 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhpb4\" (UniqueName: \"kubernetes.io/projected/d771fcdd-a2f6-432c-bee9-7964548b8f41-kube-api-access-fhpb4\") pod \"d771fcdd-a2f6-432c-bee9-7964548b8f41\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " Apr 16 23:11:44.560897 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.560763 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls\") pod \"d771fcdd-a2f6-432c-bee9-7964548b8f41\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " Apr 16 23:11:44.560897 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.560792 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d771fcdd-a2f6-432c-bee9-7964548b8f41-kserve-provision-location\") pod \"d771fcdd-a2f6-432c-bee9-7964548b8f41\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " Apr 16 23:11:44.560897 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.560826 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d771fcdd-a2f6-432c-bee9-7964548b8f41-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"d771fcdd-a2f6-432c-bee9-7964548b8f41\" (UID: \"d771fcdd-a2f6-432c-bee9-7964548b8f41\") " Apr 16 23:11:44.561208 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.561175 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d771fcdd-a2f6-432c-bee9-7964548b8f41-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"d771fcdd-a2f6-432c-bee9-7964548b8f41" (UID: "d771fcdd-a2f6-432c-bee9-7964548b8f41"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:11:44.561345 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.561219 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d771fcdd-a2f6-432c-bee9-7964548b8f41-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "d771fcdd-a2f6-432c-bee9-7964548b8f41" (UID: "d771fcdd-a2f6-432c-bee9-7964548b8f41"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:11:44.562853 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.562832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d771fcdd-a2f6-432c-bee9-7964548b8f41" (UID: "d771fcdd-a2f6-432c-bee9-7964548b8f41"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:11:44.562853 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.562842 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d771fcdd-a2f6-432c-bee9-7964548b8f41-kube-api-access-fhpb4" (OuterVolumeSpecName: "kube-api-access-fhpb4") pod "d771fcdd-a2f6-432c-bee9-7964548b8f41" (UID: "d771fcdd-a2f6-432c-bee9-7964548b8f41"). InnerVolumeSpecName "kube-api-access-fhpb4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:11:44.661608 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.661514 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhpb4\" (UniqueName: \"kubernetes.io/projected/d771fcdd-a2f6-432c-bee9-7964548b8f41-kube-api-access-fhpb4\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:11:44.661608 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.661547 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d771fcdd-a2f6-432c-bee9-7964548b8f41-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:11:44.661608 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.661557 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d771fcdd-a2f6-432c-bee9-7964548b8f41-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:11:44.661608 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:44.661567 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d771fcdd-a2f6-432c-bee9-7964548b8f41-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:11:45.077868 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.077830 2576 generic.go:358] "Generic (PLEG): container finished" podID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerID="b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3" exitCode=0 Apr 16 23:11:45.078317 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.077921 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" Apr 16 23:11:45.078317 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.077920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" event={"ID":"d771fcdd-a2f6-432c-bee9-7964548b8f41","Type":"ContainerDied","Data":"b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3"} Apr 16 23:11:45.078317 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.077977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s" event={"ID":"d771fcdd-a2f6-432c-bee9-7964548b8f41","Type":"ContainerDied","Data":"356a4fe10f40651416360ada3144faf31f838deaf8051844c3fc611f68088a26"} Apr 16 23:11:45.078317 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.077998 2576 scope.go:117] "RemoveContainer" containerID="cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e" Apr 16 23:11:45.078707 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.078675 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 23:11:45.086295 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.086275 2576 scope.go:117] "RemoveContainer" containerID="b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3" Apr 16 23:11:45.093651 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.093634 2576 scope.go:117] "RemoveContainer" containerID="50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb" Apr 16 23:11:45.099037 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.099012 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s"] Apr 16 23:11:45.100412 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.100392 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-xhb2s"] Apr 16 23:11:45.101283 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.101264 2576 scope.go:117] "RemoveContainer" containerID="cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e" Apr 16 23:11:45.101551 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:11:45.101535 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e\": container with ID starting with cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e not found: ID does not exist" containerID="cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e" Apr 16 23:11:45.101609 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.101561 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e"} err="failed to get container status \"cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e\": rpc error: code = NotFound desc = could not find container \"cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e\": container with ID starting with cd8813c2b1ec3b4ae46bd3830a555df6b2559a0a74a80c07377b4b10b1e13f8e not found: ID does not exist" Apr 16 23:11:45.101609 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.101577 2576 scope.go:117] "RemoveContainer" containerID="b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3" Apr 16 23:11:45.101776 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:11:45.101760 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3\": container with ID starting with b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3 not found: ID does not exist" containerID="b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3" Apr 16 23:11:45.101822 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.101779 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3"} err="failed to get container status \"b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3\": rpc error: code = NotFound desc = could not find container \"b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3\": container with ID starting with b8599275eefe3c4bc34c1a55230ba945c7af54c89a8638fcc19e74dafad3d5e3 not found: ID does not exist" Apr 16 23:11:45.101822 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.101802 2576 scope.go:117] "RemoveContainer" containerID="50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb" Apr 16 23:11:45.102000 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:11:45.101985 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb\": container with ID starting with 50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb not found: ID does not exist" containerID="50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb" Apr 16 23:11:45.102045 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.102003 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb"} err="failed to get container status \"50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb\": rpc error: code = NotFound desc = could not find container 
\"50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb\": container with ID starting with 50b26c5ba1e81d72657a9c2f3fe562f8642e23a8f3d4a7f7a0c2e856273bd8bb not found: ID does not exist" Apr 16 23:11:45.453304 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:45.453267 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" path="/var/lib/kubelet/pods/d771fcdd-a2f6-432c-bee9-7964548b8f41/volumes" Apr 16 23:11:50.083271 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:50.083243 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" Apr 16 23:11:50.083783 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:11:50.083757 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 23:12:00.084378 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:00.084265 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 23:12:10.084591 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:10.084545 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 23:12:20.084252 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:20.084210 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused"
Apr 16 23:12:30.084564 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:30.084523 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused"
Apr 16 23:12:40.084463 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:40.084407 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused"
Apr 16 23:12:50.085171 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:50.085136 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"
Apr 16 23:12:56.831386 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.831338 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"]
Apr 16 23:12:56.831858 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.831624 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" containerID="cri-o://e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331" gracePeriod=30
Apr 16 23:12:56.831858 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.831683 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kube-rbac-proxy" containerID="cri-o://4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a" gracePeriod=30
Apr 16 23:12:56.908055 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908018 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"]
Apr 16 23:12:56.908421 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908401 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="storage-initializer"
Apr 16 23:12:56.908503 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908424 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="storage-initializer"
Apr 16 23:12:56.908503 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908472 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kserve-container"
Apr 16 23:12:56.908503 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908481 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kserve-container"
Apr 16 23:12:56.908503 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908499 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kube-rbac-proxy"
Apr 16 23:12:56.908636 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908506 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kube-rbac-proxy"
Apr 16 23:12:56.908636 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908567 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kube-rbac-proxy"
Apr 16 23:12:56.908636 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.908577 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d771fcdd-a2f6-432c-bee9-7964548b8f41" containerName="kserve-container"
Apr 16 23:12:56.917009 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.916983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:56.919546 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.919520 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\""
Apr 16 23:12:56.919702 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.919603 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 16 23:12:56.919772 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.919756 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\""
Apr 16 23:12:56.920528 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:56.920501 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"]
Apr 16 23:12:57.035684 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.035646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89wzz\" (UniqueName: \"kubernetes.io/projected/d8b336ad-2646-4717-b581-f0c6d97d936e-kube-api-access-89wzz\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.035874 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.035690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8b336ad-2646-4717-b581-f0c6d97d936e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.035874 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.035757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.035874 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.035837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8b336ad-2646-4717-b581-f0c6d97d936e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.137158 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.137062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8b336ad-2646-4717-b581-f0c6d97d936e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.137158 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.137118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.137158 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.137161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8b336ad-2646-4717-b581-f0c6d97d936e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.137462 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.137201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89wzz\" (UniqueName: \"kubernetes.io/projected/d8b336ad-2646-4717-b581-f0c6d97d936e-kube-api-access-89wzz\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.137462 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:12:57.137294 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found
Apr 16 23:12:57.137462 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:12:57.137416 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls podName:d8b336ad-2646-4717-b581-f0c6d97d936e nodeName:}" failed. No retries permitted until 2026-04-16 23:12:57.637391359 +0000 UTC m=+3556.762015625 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls") pod "isvc-sklearn-s3-predictor-88457d696-6fqtp" (UID: "d8b336ad-2646-4717-b581-f0c6d97d936e") : secret "isvc-sklearn-s3-predictor-serving-cert" not found
Apr 16 23:12:57.137588 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.137528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8b336ad-2646-4717-b581-f0c6d97d936e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.137898 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.137876 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8b336ad-2646-4717-b581-f0c6d97d936e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.146131 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.146097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89wzz\" (UniqueName: \"kubernetes.io/projected/d8b336ad-2646-4717-b581-f0c6d97d936e-kube-api-access-89wzz\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.299112 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.299080 2576 generic.go:358] "Generic (PLEG): container finished" podID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerID="4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a" exitCode=2
Apr 16 23:12:57.299315 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.299150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" event={"ID":"49a9a0a8-4a30-44b1-abb7-eeadffcc2922","Type":"ContainerDied","Data":"4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a"}
Apr 16 23:12:57.642268 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.642232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.644725 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.644705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-6fqtp\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.828711 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.828668 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:12:57.954720 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:57.954667 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"]
Apr 16 23:12:57.957206 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:12:57.957172 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b336ad_2646_4717_b581_f0c6d97d936e.slice/crio-33241ddc769c681c457cf8ca12925adee2be5dbe190dc2956c52d48549baf811 WatchSource:0}: Error finding container 33241ddc769c681c457cf8ca12925adee2be5dbe190dc2956c52d48549baf811: Status 404 returned error can't find the container with id 33241ddc769c681c457cf8ca12925adee2be5dbe190dc2956c52d48549baf811
Apr 16 23:12:58.304012 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:58.303975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" event={"ID":"d8b336ad-2646-4717-b581-f0c6d97d936e","Type":"ContainerStarted","Data":"86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9"}
Apr 16 23:12:58.304186 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:58.304017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" event={"ID":"d8b336ad-2646-4717-b581-f0c6d97d936e","Type":"ContainerStarted","Data":"33241ddc769c681c457cf8ca12925adee2be5dbe190dc2956c52d48549baf811"}
Apr 16 23:12:59.308451 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:59.308405 2576 generic.go:358] "Generic (PLEG): container finished" podID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerID="86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9" exitCode=0
Apr 16 23:12:59.308830 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:12:59.308465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" event={"ID":"d8b336ad-2646-4717-b581-f0c6d97d936e","Type":"ContainerDied","Data":"86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9"}
Apr 16 23:13:00.078579 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.078542 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.67:8643/healthz\": dial tcp 10.133.0.67:8643: connect: connection refused"
Apr 16 23:13:00.083706 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.083681 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused"
Apr 16 23:13:00.313194 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.313159 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" event={"ID":"d8b336ad-2646-4717-b581-f0c6d97d936e","Type":"ContainerStarted","Data":"214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc"}
Apr 16 23:13:00.313194 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.313194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" event={"ID":"d8b336ad-2646-4717-b581-f0c6d97d936e","Type":"ContainerStarted","Data":"03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1"}
Apr 16 23:13:00.313646 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.313343 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:13:00.333935 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.333845 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podStartSLOduration=4.333826872 podStartE2EDuration="4.333826872s" podCreationTimestamp="2026-04-16 23:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:13:00.332929789 +0000 UTC m=+3559.457554099" watchObservedRunningTime="2026-04-16 23:13:00.333826872 +0000 UTC m=+3559.458451161"
Apr 16 23:13:00.575776 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.575753 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"
Apr 16 23:13:00.664977 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.664902 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kserve-provision-location\") pod \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") "
Apr 16 23:13:00.664977 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.664966 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls\") pod \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") "
Apr 16 23:13:00.665198 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.664986 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") "
Apr 16 23:13:00.665198 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.665006 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89jwb\" (UniqueName: \"kubernetes.io/projected/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kube-api-access-89jwb\") pod \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\" (UID: \"49a9a0a8-4a30-44b1-abb7-eeadffcc2922\") "
Apr 16 23:13:00.665308 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.665251 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "49a9a0a8-4a30-44b1-abb7-eeadffcc2922" (UID: "49a9a0a8-4a30-44b1-abb7-eeadffcc2922"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:13:00.665398 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.665349 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "49a9a0a8-4a30-44b1-abb7-eeadffcc2922" (UID: "49a9a0a8-4a30-44b1-abb7-eeadffcc2922"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:13:00.667016 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.666994 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "49a9a0a8-4a30-44b1-abb7-eeadffcc2922" (UID: "49a9a0a8-4a30-44b1-abb7-eeadffcc2922"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:13:00.667106 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.667033 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kube-api-access-89jwb" (OuterVolumeSpecName: "kube-api-access-89jwb") pod "49a9a0a8-4a30-44b1-abb7-eeadffcc2922" (UID: "49a9a0a8-4a30-44b1-abb7-eeadffcc2922"). InnerVolumeSpecName "kube-api-access-89jwb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:13:00.765919 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.765879 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:13:00.765919 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.765912 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:13:00.765919 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.765923 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:13:00.765919 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:00.765932 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-89jwb\" (UniqueName: \"kubernetes.io/projected/49a9a0a8-4a30-44b1-abb7-eeadffcc2922-kube-api-access-89jwb\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:13:01.317697 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.317658 2576 generic.go:358] "Generic (PLEG): container finished" podID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerID="e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331" exitCode=0
Apr 16 23:13:01.318095 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.317743 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"
Apr 16 23:13:01.318095 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.317739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" event={"ID":"49a9a0a8-4a30-44b1-abb7-eeadffcc2922","Type":"ContainerDied","Data":"e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331"}
Apr 16 23:13:01.318095 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.317837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh" event={"ID":"49a9a0a8-4a30-44b1-abb7-eeadffcc2922","Type":"ContainerDied","Data":"ab881f38eca7961598545eb11d8b34fd51819315dfa6a257afe0935d714c95d4"}
Apr 16 23:13:01.318095 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.317852 2576 scope.go:117] "RemoveContainer" containerID="4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a"
Apr 16 23:13:01.318319 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.318300 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:13:01.319766 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.319741 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:13:01.326051 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.326024 2576 scope.go:117] "RemoveContainer" containerID="e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331"
Apr 16 23:13:01.333364 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.333347 2576 scope.go:117] "RemoveContainer" containerID="d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73"
Apr 16 23:13:01.337978 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.337955 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"]
Apr 16 23:13:01.341200 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.341172 2576 scope.go:117] "RemoveContainer" containerID="4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a"
Apr 16 23:13:01.341543 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:13:01.341522 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a\": container with ID starting with 4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a not found: ID does not exist" containerID="4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a"
Apr 16 23:13:01.341645 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.341556 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a"} err="failed to get container status \"4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a\": rpc error: code = NotFound desc = could not find container \"4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a\": container with ID starting with 4e1ba738abdfe8fa433d30c82075cbef487b177a985ea2a48d34aeb27853af7a not found: ID does not exist"
Apr 16 23:13:01.341645 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.341581 2576 scope.go:117] "RemoveContainer" containerID="e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331"
Apr 16 23:13:01.341917 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:13:01.341890 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331\": container with ID starting with e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331 not found: ID does not exist" containerID="e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331"
Apr 16 23:13:01.341997 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.341926 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331"} err="failed to get container status \"e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331\": rpc error: code = NotFound desc = could not find container \"e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331\": container with ID starting with e893e2dbf30b9a59998b9788c2809368fb219e72f2612be4cf965062eca3d331 not found: ID does not exist"
Apr 16 23:13:01.341997 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.341952 2576 scope.go:117] "RemoveContainer" containerID="d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73"
Apr 16 23:13:01.342180 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.342163 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-xctlh"]
Apr 16 23:13:01.342234 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:13:01.342211 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73\": container with ID starting with d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73 not found: ID does not exist" containerID="d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73"
Apr 16 23:13:01.342273 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.342234 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73"} err="failed to get container status \"d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73\": rpc error: code = NotFound desc = could not find container \"d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73\": container with ID starting with d3b9ffb70447295e2ac107aba7bcdea6d55bd5d0517d177eb0c204e7464aba73 not found: ID does not exist"
Apr 16 23:13:01.452629 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:01.452596 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" path="/var/lib/kubelet/pods/49a9a0a8-4a30-44b1-abb7-eeadffcc2922/volumes"
Apr 16 23:13:02.320595 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:02.320552 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:13:07.325813 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:07.325783 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:13:07.326409 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:07.326379 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:13:17.327065 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:17.327029 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:13:27.327415 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:27.327369 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:13:37.326657 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:37.326617 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:13:47.326656 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:47.326613 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:13:57.326869 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:13:57.326825 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:14:07.327569 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:07.327537 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"
Apr 16 23:14:17.071826 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.071783 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"]
Apr 16 23:14:17.074285 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.072120 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" containerID="cri-o://03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1" gracePeriod=30
Apr 16 23:14:17.074285 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.072155 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kube-rbac-proxy" containerID="cri-o://214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc" gracePeriod=30
Apr 16 23:14:17.153018 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.152983 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"]
Apr 16 23:14:17.153297 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.153285 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kube-rbac-proxy"
Apr 16 23:14:17.153358 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.153299 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kube-rbac-proxy"
Apr 16 23:14:17.153358 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.153310 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container"
Apr 16 23:14:17.153358 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.153316 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container"
Apr 16 23:14:17.153358 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.153346 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="storage-initializer"
Apr 16 23:14:17.153358 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.153353 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="storage-initializer"
Apr 16 23:14:17.153521 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.153405 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kube-rbac-proxy"
Apr 16 23:14:17.153521 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.153412 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="49a9a0a8-4a30-44b1-abb7-eeadffcc2922" containerName="kserve-container"
Apr 16 23:14:17.156526 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.156505 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16 23:14:17.158985 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.158965 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\""
Apr 16 23:14:17.159094 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.159003 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\""
Apr 16 23:14:17.159094 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.159020 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 23:14:17.166001 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.165977 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"]
Apr 16 23:14:17.249128 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.249081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16 23:14:17.249366 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.249159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07efe955-2360-4ece-b0b8-7847bee6ff75-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16 23:14:17.249366 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.249184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl8gh\" (UniqueName: \"kubernetes.io/projected/07efe955-2360-4ece-b0b8-7847bee6ff75-kube-api-access-tl8gh\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16 23:14:17.249366 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.249205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16 23:14:17.249366 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.249226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07efe955-2360-4ece-b0b8-7847bee6ff75-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16 23:14:17.321467 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.321428 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.68:8643/healthz\": dial tcp 10.133.0.68:8643: connect: connection refused"
Apr 16 23:14:17.326787 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.326730 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 23:14:17.350459 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.350418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16 23:14:17.350620 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.350577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07efe955-2360-4ece-b0b8-7847bee6ff75-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16 23:14:17.350686 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.350623 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl8gh\" (UniqueName: \"kubernetes.io/projected/07efe955-2360-4ece-b0b8-7847bee6ff75-kube-api-access-tl8gh\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"
Apr 16
23:14:17.350686 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.350660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:17.350801 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.350691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07efe955-2360-4ece-b0b8-7847bee6ff75-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:17.351072 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.351046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07efe955-2360-4ece-b0b8-7847bee6ff75-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:17.351174 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.351151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:17.351265 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.351249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:17.353047 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.353017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07efe955-2360-4ece-b0b8-7847bee6ff75-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:17.357909 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.357887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl8gh\" (UniqueName: \"kubernetes.io/projected/07efe955-2360-4ece-b0b8-7847bee6ff75-kube-api-access-tl8gh\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:17.467028 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.466991 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:17.548524 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.548486 2576 generic.go:358] "Generic (PLEG): container finished" podID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerID="214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc" exitCode=2 Apr 16 23:14:17.548685 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.548529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" event={"ID":"d8b336ad-2646-4717-b581-f0c6d97d936e","Type":"ContainerDied","Data":"214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc"} Apr 16 23:14:17.592365 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.592320 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"] Apr 16 23:14:17.594783 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:14:17.594757 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07efe955_2360_4ece_b0b8_7847bee6ff75.slice/crio-5e1721ee62637c1556eb0d6e345fe062829ceb6d081d5fab960770fef518e13e WatchSource:0}: Error finding container 5e1721ee62637c1556eb0d6e345fe062829ceb6d081d5fab960770fef518e13e: Status 404 returned error can't find the container with id 5e1721ee62637c1556eb0d6e345fe062829ceb6d081d5fab960770fef518e13e Apr 16 23:14:17.596499 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:17.596484 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:14:18.553935 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:18.553898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" 
event={"ID":"07efe955-2360-4ece-b0b8-7847bee6ff75","Type":"ContainerStarted","Data":"83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997"} Apr 16 23:14:18.554401 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:18.553942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" event={"ID":"07efe955-2360-4ece-b0b8-7847bee6ff75","Type":"ContainerStarted","Data":"5e1721ee62637c1556eb0d6e345fe062829ceb6d081d5fab960770fef518e13e"} Apr 16 23:14:19.557805 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:19.557768 2576 generic.go:358] "Generic (PLEG): container finished" podID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerID="83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997" exitCode=0 Apr 16 23:14:19.558185 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:19.557857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" event={"ID":"07efe955-2360-4ece-b0b8-7847bee6ff75","Type":"ContainerDied","Data":"83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997"} Apr 16 23:14:20.562822 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:20.562790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" event={"ID":"07efe955-2360-4ece-b0b8-7847bee6ff75","Type":"ContainerStarted","Data":"5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a"} Apr 16 23:14:20.562822 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:20.562823 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" event={"ID":"07efe955-2360-4ece-b0b8-7847bee6ff75","Type":"ContainerStarted","Data":"50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b"} Apr 16 23:14:20.563368 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:20.562911 
2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:20.581603 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:20.581558 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podStartSLOduration=3.581541706 podStartE2EDuration="3.581541706s" podCreationTimestamp="2026-04-16 23:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:14:20.580455949 +0000 UTC m=+3639.705080230" watchObservedRunningTime="2026-04-16 23:14:20.581541706 +0000 UTC m=+3639.706165993" Apr 16 23:14:21.566151 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:21.566114 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:21.567399 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:21.567370 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:14:22.120600 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.120579 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" Apr 16 23:14:22.188889 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.188857 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8b336ad-2646-4717-b581-f0c6d97d936e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"d8b336ad-2646-4717-b581-f0c6d97d936e\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " Apr 16 23:14:22.189077 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.188908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls\") pod \"d8b336ad-2646-4717-b581-f0c6d97d936e\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " Apr 16 23:14:22.189077 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.188942 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89wzz\" (UniqueName: \"kubernetes.io/projected/d8b336ad-2646-4717-b581-f0c6d97d936e-kube-api-access-89wzz\") pod \"d8b336ad-2646-4717-b581-f0c6d97d936e\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " Apr 16 23:14:22.189077 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.188980 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8b336ad-2646-4717-b581-f0c6d97d936e-kserve-provision-location\") pod \"d8b336ad-2646-4717-b581-f0c6d97d936e\" (UID: \"d8b336ad-2646-4717-b581-f0c6d97d936e\") " Apr 16 23:14:22.189342 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.189296 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b336ad-2646-4717-b581-f0c6d97d936e-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod 
"d8b336ad-2646-4717-b581-f0c6d97d936e" (UID: "d8b336ad-2646-4717-b581-f0c6d97d936e"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:14:22.189411 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.189389 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8b336ad-2646-4717-b581-f0c6d97d936e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d8b336ad-2646-4717-b581-f0c6d97d936e" (UID: "d8b336ad-2646-4717-b581-f0c6d97d936e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:14:22.191050 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.191028 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b336ad-2646-4717-b581-f0c6d97d936e-kube-api-access-89wzz" (OuterVolumeSpecName: "kube-api-access-89wzz") pod "d8b336ad-2646-4717-b581-f0c6d97d936e" (UID: "d8b336ad-2646-4717-b581-f0c6d97d936e"). InnerVolumeSpecName "kube-api-access-89wzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:14:22.191147 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.191057 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d8b336ad-2646-4717-b581-f0c6d97d936e" (UID: "d8b336ad-2646-4717-b581-f0c6d97d936e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:14:22.290342 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.290283 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8b336ad-2646-4717-b581-f0c6d97d936e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:14:22.290342 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.290314 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8b336ad-2646-4717-b581-f0c6d97d936e-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:14:22.290342 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.290346 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-89wzz\" (UniqueName: \"kubernetes.io/projected/d8b336ad-2646-4717-b581-f0c6d97d936e-kube-api-access-89wzz\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:14:22.290578 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.290357 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8b336ad-2646-4717-b581-f0c6d97d936e-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:14:22.570579 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.570491 2576 generic.go:358] "Generic (PLEG): container finished" podID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerID="03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1" exitCode=0 Apr 16 23:14:22.570579 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.570566 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" Apr 16 23:14:22.571073 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.570578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" event={"ID":"d8b336ad-2646-4717-b581-f0c6d97d936e","Type":"ContainerDied","Data":"03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1"} Apr 16 23:14:22.571073 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.570618 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp" event={"ID":"d8b336ad-2646-4717-b581-f0c6d97d936e","Type":"ContainerDied","Data":"33241ddc769c681c457cf8ca12925adee2be5dbe190dc2956c52d48549baf811"} Apr 16 23:14:22.571073 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.570634 2576 scope.go:117] "RemoveContainer" containerID="214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc" Apr 16 23:14:22.571310 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.571289 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:14:22.578740 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.578719 2576 scope.go:117] "RemoveContainer" containerID="03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1" Apr 16 23:14:22.591479 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.591437 2576 scope.go:117] "RemoveContainer" containerID="86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9" Apr 16 23:14:22.592785 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.592760 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"] Apr 16 
23:14:22.595448 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.595426 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-6fqtp"] Apr 16 23:14:22.599225 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.599209 2576 scope.go:117] "RemoveContainer" containerID="214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc" Apr 16 23:14:22.599531 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:14:22.599512 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc\": container with ID starting with 214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc not found: ID does not exist" containerID="214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc" Apr 16 23:14:22.599585 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.599542 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc"} err="failed to get container status \"214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc\": rpc error: code = NotFound desc = could not find container \"214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc\": container with ID starting with 214a9a151021a28a5d82807878045e1f5a6958f5d6408ee867de639a756b82bc not found: ID does not exist" Apr 16 23:14:22.599585 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.599560 2576 scope.go:117] "RemoveContainer" containerID="03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1" Apr 16 23:14:22.599787 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:14:22.599771 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1\": container with ID starting with 
03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1 not found: ID does not exist" containerID="03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1" Apr 16 23:14:22.599827 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.599793 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1"} err="failed to get container status \"03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1\": rpc error: code = NotFound desc = could not find container \"03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1\": container with ID starting with 03dd084b5b3bcb62b8124b9850f423fc3b4e3b6946da01b5a667a01a270e1ad1 not found: ID does not exist" Apr 16 23:14:22.599827 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.599809 2576 scope.go:117] "RemoveContainer" containerID="86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9" Apr 16 23:14:22.600033 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:14:22.600015 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9\": container with ID starting with 86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9 not found: ID does not exist" containerID="86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9" Apr 16 23:14:22.600090 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:22.600041 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9"} err="failed to get container status \"86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9\": rpc error: code = NotFound desc = could not find container \"86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9\": container with ID starting with 
86c7101f9f64cd09a14ce17d2e6afbe6aae51f11611dd3b73f64e4a965a93ce9 not found: ID does not exist" Apr 16 23:14:23.453514 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:23.453482 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" path="/var/lib/kubelet/pods/d8b336ad-2646-4717-b581-f0c6d97d936e/volumes" Apr 16 23:14:27.575975 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:27.575944 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:14:27.576523 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:27.576496 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:14:37.577127 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:37.577088 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:14:47.576865 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:47.576828 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:14:57.576531 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:14:57.576487 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" 
podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:15:07.576664 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:07.576623 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:15:17.576676 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:17.576639 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:15:27.577636 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:27.577597 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:15:37.171388 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:37.171335 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"] Apr 16 23:15:37.171927 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:37.171689 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" containerID="cri-o://50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b" gracePeriod=30 Apr 16 23:15:37.171927 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:37.171741 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kube-rbac-proxy" containerID="cri-o://5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a" gracePeriod=30 Apr 16 23:15:37.571610 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:37.571559 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.69:8643/healthz\": dial tcp 10.133.0.69:8643: connect: connection refused" Apr 16 23:15:37.577383 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:37.577352 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 23:15:37.803411 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:37.803379 2576 generic.go:358] "Generic (PLEG): container finished" podID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerID="5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a" exitCode=2 Apr 16 23:15:37.803574 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:37.803452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" event={"ID":"07efe955-2360-4ece-b0b8-7847bee6ff75","Type":"ContainerDied","Data":"5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a"} Apr 16 23:15:38.242121 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.242082 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l"] Apr 16 23:15:38.242679 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:15:38.242657 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" Apr 16 23:15:38.242725 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.242683 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" Apr 16 23:15:38.242725 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.242701 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="storage-initializer" Apr 16 23:15:38.242725 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.242710 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="storage-initializer" Apr 16 23:15:38.242838 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.242727 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kube-rbac-proxy" Apr 16 23:15:38.242838 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.242737 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kube-rbac-proxy" Apr 16 23:15:38.242838 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.242825 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kserve-container" Apr 16 23:15:38.242934 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.242836 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8b336ad-2646-4717-b581-f0c6d97d936e" containerName="kube-rbac-proxy" Apr 16 23:15:38.246372 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.246354 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.248749 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.248726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 16 23:15:38.248749 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.248725 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 16 23:15:38.252293 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.252267 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l"] Apr 16 23:15:38.397581 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.397547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9f060d0-fa40-44e4-8f4a-dd989af97836-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.397773 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.397599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9f060d0-fa40-44e4-8f4a-dd989af97836-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.397773 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.397650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rbv4c\" (UniqueName: \"kubernetes.io/projected/a9f060d0-fa40-44e4-8f4a-dd989af97836-kube-api-access-rbv4c\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.397773 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.397685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9f060d0-fa40-44e4-8f4a-dd989af97836-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.498390 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.498290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9f060d0-fa40-44e4-8f4a-dd989af97836-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.498390 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.498374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9f060d0-fa40-44e4-8f4a-dd989af97836-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.498624 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:15:38.498431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9f060d0-fa40-44e4-8f4a-dd989af97836-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.498624 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.498467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbv4c\" (UniqueName: \"kubernetes.io/projected/a9f060d0-fa40-44e4-8f4a-dd989af97836-kube-api-access-rbv4c\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.498842 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.498818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9f060d0-fa40-44e4-8f4a-dd989af97836-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.499111 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.499085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9f060d0-fa40-44e4-8f4a-dd989af97836-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" 
Apr 16 23:15:38.501002 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.500976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9f060d0-fa40-44e4-8f4a-dd989af97836-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.506050 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.506029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbv4c\" (UniqueName: \"kubernetes.io/projected/a9f060d0-fa40-44e4-8f4a-dd989af97836-kube-api-access-rbv4c\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.557446 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.557410 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:38.680512 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.680297 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l"] Apr 16 23:15:38.683229 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:15:38.683201 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f060d0_fa40_44e4_8f4a_dd989af97836.slice/crio-3e8ed6afb5f5164c88d9f3b82ec20f05ed11ce5e39c24ff75356501f32915e1f WatchSource:0}: Error finding container 3e8ed6afb5f5164c88d9f3b82ec20f05ed11ce5e39c24ff75356501f32915e1f: Status 404 returned error can't find the container with id 3e8ed6afb5f5164c88d9f3b82ec20f05ed11ce5e39c24ff75356501f32915e1f Apr 16 23:15:38.810447 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.810361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" event={"ID":"a9f060d0-fa40-44e4-8f4a-dd989af97836","Type":"ContainerStarted","Data":"0724016aec3dfaca03c00afe01ddf4b696906b8c541599bd8fd35a917612d9a9"} Apr 16 23:15:38.810447 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:38.810397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" event={"ID":"a9f060d0-fa40-44e4-8f4a-dd989af97836","Type":"ContainerStarted","Data":"3e8ed6afb5f5164c88d9f3b82ec20f05ed11ce5e39c24ff75356501f32915e1f"} Apr 16 23:15:41.618747 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.618726 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:15:41.722182 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.722150 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl8gh\" (UniqueName: \"kubernetes.io/projected/07efe955-2360-4ece-b0b8-7847bee6ff75-kube-api-access-tl8gh\") pod \"07efe955-2360-4ece-b0b8-7847bee6ff75\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " Apr 16 23:15:41.722380 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.722199 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-cabundle-cert\") pod \"07efe955-2360-4ece-b0b8-7847bee6ff75\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " Apr 16 23:15:41.722380 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.722232 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07efe955-2360-4ece-b0b8-7847bee6ff75-kserve-provision-location\") pod \"07efe955-2360-4ece-b0b8-7847bee6ff75\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " Apr 16 23:15:41.722380 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.722273 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"07efe955-2360-4ece-b0b8-7847bee6ff75\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " Apr 16 23:15:41.722380 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.722307 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07efe955-2360-4ece-b0b8-7847bee6ff75-proxy-tls\") pod 
\"07efe955-2360-4ece-b0b8-7847bee6ff75\" (UID: \"07efe955-2360-4ece-b0b8-7847bee6ff75\") " Apr 16 23:15:41.722661 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.722639 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07efe955-2360-4ece-b0b8-7847bee6ff75-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07efe955-2360-4ece-b0b8-7847bee6ff75" (UID: "07efe955-2360-4ece-b0b8-7847bee6ff75"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:15:41.722724 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.722680 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "07efe955-2360-4ece-b0b8-7847bee6ff75" (UID: "07efe955-2360-4ece-b0b8-7847bee6ff75"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:15:41.722724 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.722677 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "07efe955-2360-4ece-b0b8-7847bee6ff75" (UID: "07efe955-2360-4ece-b0b8-7847bee6ff75"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:15:41.724396 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.724376 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07efe955-2360-4ece-b0b8-7847bee6ff75-kube-api-access-tl8gh" (OuterVolumeSpecName: "kube-api-access-tl8gh") pod "07efe955-2360-4ece-b0b8-7847bee6ff75" (UID: "07efe955-2360-4ece-b0b8-7847bee6ff75"). InnerVolumeSpecName "kube-api-access-tl8gh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:15:41.724503 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.724422 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07efe955-2360-4ece-b0b8-7847bee6ff75-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "07efe955-2360-4ece-b0b8-7847bee6ff75" (UID: "07efe955-2360-4ece-b0b8-7847bee6ff75"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:15:41.821201 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.821116 2576 generic.go:358] "Generic (PLEG): container finished" podID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerID="50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b" exitCode=0 Apr 16 23:15:41.821383 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.821199 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" Apr 16 23:15:41.821383 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.821199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" event={"ID":"07efe955-2360-4ece-b0b8-7847bee6ff75","Type":"ContainerDied","Data":"50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b"} Apr 16 23:15:41.821383 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.821240 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg" event={"ID":"07efe955-2360-4ece-b0b8-7847bee6ff75","Type":"ContainerDied","Data":"5e1721ee62637c1556eb0d6e345fe062829ceb6d081d5fab960770fef518e13e"} Apr 16 23:15:41.821383 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.821260 2576 scope.go:117] "RemoveContainer" containerID="5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a" Apr 16 23:15:41.822582 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.822557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_a9f060d0-fa40-44e4-8f4a-dd989af97836/storage-initializer/0.log" Apr 16 23:15:41.822722 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.822593 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9f060d0-fa40-44e4-8f4a-dd989af97836" containerID="0724016aec3dfaca03c00afe01ddf4b696906b8c541599bd8fd35a917612d9a9" exitCode=1 Apr 16 23:15:41.822722 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.822629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" event={"ID":"a9f060d0-fa40-44e4-8f4a-dd989af97836","Type":"ContainerDied","Data":"0724016aec3dfaca03c00afe01ddf4b696906b8c541599bd8fd35a917612d9a9"} Apr 16 
23:15:41.822880 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.822861 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-cabundle-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:41.822945 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.822885 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07efe955-2360-4ece-b0b8-7847bee6ff75-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:41.822945 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.822924 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07efe955-2360-4ece-b0b8-7847bee6ff75-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:41.822945 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.822941 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07efe955-2360-4ece-b0b8-7847bee6ff75-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:41.823121 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.822960 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tl8gh\" (UniqueName: \"kubernetes.io/projected/07efe955-2360-4ece-b0b8-7847bee6ff75-kube-api-access-tl8gh\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:41.829381 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.829357 2576 scope.go:117] "RemoveContainer" containerID="50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b" Apr 16 23:15:41.836863 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.836841 2576 scope.go:117] "RemoveContainer" 
containerID="83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997" Apr 16 23:15:41.845293 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.845276 2576 scope.go:117] "RemoveContainer" containerID="5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a" Apr 16 23:15:41.845586 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:15:41.845566 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a\": container with ID starting with 5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a not found: ID does not exist" containerID="5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a" Apr 16 23:15:41.845641 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.845594 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a"} err="failed to get container status \"5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a\": rpc error: code = NotFound desc = could not find container \"5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a\": container with ID starting with 5097c507c8803f4ca4a63af14f5de7a9fddb58211904411f15a64550a685118a not found: ID does not exist" Apr 16 23:15:41.845641 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.845612 2576 scope.go:117] "RemoveContainer" containerID="50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b" Apr 16 23:15:41.845867 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:15:41.845850 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b\": container with ID starting with 50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b not found: ID does not exist" 
containerID="50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b" Apr 16 23:15:41.845910 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.845875 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b"} err="failed to get container status \"50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b\": rpc error: code = NotFound desc = could not find container \"50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b\": container with ID starting with 50d956b777aed4e64bb5061f5c6cdbd8395a1461b6ae35863618042ada53774b not found: ID does not exist" Apr 16 23:15:41.845910 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.845892 2576 scope.go:117] "RemoveContainer" containerID="83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997" Apr 16 23:15:41.846104 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:15:41.846088 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997\": container with ID starting with 83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997 not found: ID does not exist" containerID="83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997" Apr 16 23:15:41.846143 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.846107 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997"} err="failed to get container status \"83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997\": rpc error: code = NotFound desc = could not find container \"83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997\": container with ID starting with 83daba132cfa957c726ed1b919355e1d2ae64175b9e1d6828f6f4dff59bed997 not found: ID does not exist" Apr 16 
23:15:41.853452 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.853432 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"] Apr 16 23:15:41.856640 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:41.856618 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-rkvjg"] Apr 16 23:15:42.827889 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:42.827862 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_a9f060d0-fa40-44e4-8f4a-dd989af97836/storage-initializer/0.log" Apr 16 23:15:42.828268 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:42.827940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" event={"ID":"a9f060d0-fa40-44e4-8f4a-dd989af97836","Type":"ContainerStarted","Data":"8e7928e7b7b4dc832f5ae322baa5c3a76de6de879c79ba5219d530135ea531e3"} Apr 16 23:15:43.453716 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:43.453685 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" path="/var/lib/kubelet/pods/07efe955-2360-4ece-b0b8-7847bee6ff75/volumes" Apr 16 23:15:46.841058 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:46.841029 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_a9f060d0-fa40-44e4-8f4a-dd989af97836/storage-initializer/1.log" Apr 16 23:15:46.841546 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:46.841420 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_a9f060d0-fa40-44e4-8f4a-dd989af97836/storage-initializer/0.log" Apr 16 23:15:46.841546 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:15:46.841462 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9f060d0-fa40-44e4-8f4a-dd989af97836" containerID="8e7928e7b7b4dc832f5ae322baa5c3a76de6de879c79ba5219d530135ea531e3" exitCode=1 Apr 16 23:15:46.841546 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:46.841530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" event={"ID":"a9f060d0-fa40-44e4-8f4a-dd989af97836","Type":"ContainerDied","Data":"8e7928e7b7b4dc832f5ae322baa5c3a76de6de879c79ba5219d530135ea531e3"} Apr 16 23:15:46.841687 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:46.841566 2576 scope.go:117] "RemoveContainer" containerID="0724016aec3dfaca03c00afe01ddf4b696906b8c541599bd8fd35a917612d9a9" Apr 16 23:15:46.841992 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:46.841974 2576 scope.go:117] "RemoveContainer" containerID="0724016aec3dfaca03c00afe01ddf4b696906b8c541599bd8fd35a917612d9a9" Apr 16 23:15:46.851904 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:15:46.851876 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_kserve-ci-e2e-test_a9f060d0-fa40-44e4-8f4a-dd989af97836_0 in pod sandbox 3e8ed6afb5f5164c88d9f3b82ec20f05ed11ce5e39c24ff75356501f32915e1f from index: no such id: '0724016aec3dfaca03c00afe01ddf4b696906b8c541599bd8fd35a917612d9a9'" containerID="0724016aec3dfaca03c00afe01ddf4b696906b8c541599bd8fd35a917612d9a9" Apr 16 23:15:46.851988 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:15:46.851925 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_kserve-ci-e2e-test_a9f060d0-fa40-44e4-8f4a-dd989af97836_0 in pod sandbox 3e8ed6afb5f5164c88d9f3b82ec20f05ed11ce5e39c24ff75356501f32915e1f from index: no such id: '0724016aec3dfaca03c00afe01ddf4b696906b8c541599bd8fd35a917612d9a9'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_kserve-ci-e2e-test(a9f060d0-fa40-44e4-8f4a-dd989af97836)\"" logger="UnhandledError" Apr 16 23:15:46.853244 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:15:46.853224 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_kserve-ci-e2e-test(a9f060d0-fa40-44e4-8f4a-dd989af97836)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" podUID="a9f060d0-fa40-44e4-8f4a-dd989af97836" Apr 16 23:15:47.845593 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:47.845563 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_a9f060d0-fa40-44e4-8f4a-dd989af97836/storage-initializer/1.log" Apr 16 23:15:48.232060 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.232027 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l"] Apr 16 23:15:48.369953 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.369926 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_a9f060d0-fa40-44e4-8f4a-dd989af97836/storage-initializer/1.log" Apr 16 23:15:48.370088 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.369991 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:48.477653 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.477624 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9f060d0-fa40-44e4-8f4a-dd989af97836-proxy-tls\") pod \"a9f060d0-fa40-44e4-8f4a-dd989af97836\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " Apr 16 23:15:48.477653 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.477659 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbv4c\" (UniqueName: \"kubernetes.io/projected/a9f060d0-fa40-44e4-8f4a-dd989af97836-kube-api-access-rbv4c\") pod \"a9f060d0-fa40-44e4-8f4a-dd989af97836\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " Apr 16 23:15:48.477884 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.477702 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9f060d0-fa40-44e4-8f4a-dd989af97836-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"a9f060d0-fa40-44e4-8f4a-dd989af97836\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " Apr 16 23:15:48.477933 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.477883 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9f060d0-fa40-44e4-8f4a-dd989af97836-kserve-provision-location\") pod \"a9f060d0-fa40-44e4-8f4a-dd989af97836\" (UID: \"a9f060d0-fa40-44e4-8f4a-dd989af97836\") " Apr 16 23:15:48.478025 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.478005 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f060d0-fa40-44e4-8f4a-dd989af97836-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "a9f060d0-fa40-44e4-8f4a-dd989af97836" (UID: "a9f060d0-fa40-44e4-8f4a-dd989af97836"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:15:48.478145 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.478122 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f060d0-fa40-44e4-8f4a-dd989af97836-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a9f060d0-fa40-44e4-8f4a-dd989af97836" (UID: "a9f060d0-fa40-44e4-8f4a-dd989af97836"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:15:48.478209 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.478144 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9f060d0-fa40-44e4-8f4a-dd989af97836-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:48.479729 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.479708 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f060d0-fa40-44e4-8f4a-dd989af97836-kube-api-access-rbv4c" (OuterVolumeSpecName: "kube-api-access-rbv4c") pod "a9f060d0-fa40-44e4-8f4a-dd989af97836" (UID: "a9f060d0-fa40-44e4-8f4a-dd989af97836"). InnerVolumeSpecName "kube-api-access-rbv4c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:15:48.479894 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.479869 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f060d0-fa40-44e4-8f4a-dd989af97836-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a9f060d0-fa40-44e4-8f4a-dd989af97836" (UID: "a9f060d0-fa40-44e4-8f4a-dd989af97836"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:15:48.578711 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.578673 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9f060d0-fa40-44e4-8f4a-dd989af97836-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:48.578711 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.578705 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9f060d0-fa40-44e4-8f4a-dd989af97836-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:48.578711 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.578715 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbv4c\" (UniqueName: \"kubernetes.io/projected/a9f060d0-fa40-44e4-8f4a-dd989af97836-kube-api-access-rbv4c\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:15:48.850053 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.849974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l_a9f060d0-fa40-44e4-8f4a-dd989af97836/storage-initializer/1.log" Apr 16 23:15:48.850607 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.850070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" 
event={"ID":"a9f060d0-fa40-44e4-8f4a-dd989af97836","Type":"ContainerDied","Data":"3e8ed6afb5f5164c88d9f3b82ec20f05ed11ce5e39c24ff75356501f32915e1f"} Apr 16 23:15:48.850607 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.850096 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l" Apr 16 23:15:48.850607 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.850108 2576 scope.go:117] "RemoveContainer" containerID="8e7928e7b7b4dc832f5ae322baa5c3a76de6de879c79ba5219d530135ea531e3" Apr 16 23:15:48.882857 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.882828 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l"] Apr 16 23:15:48.885939 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:48.885909 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-7fb4l"] Apr 16 23:15:49.309984 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.309948 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l"] Apr 16 23:15:49.310265 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310252 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9f060d0-fa40-44e4-8f4a-dd989af97836" containerName="storage-initializer" Apr 16 23:15:49.310351 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310267 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f060d0-fa40-44e4-8f4a-dd989af97836" containerName="storage-initializer" Apr 16 23:15:49.310351 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310280 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9f060d0-fa40-44e4-8f4a-dd989af97836" containerName="storage-initializer" Apr 16 23:15:49.310351 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:15:49.310289 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f060d0-fa40-44e4-8f4a-dd989af97836" containerName="storage-initializer" Apr 16 23:15:49.310351 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310307 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kube-rbac-proxy" Apr 16 23:15:49.310351 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310313 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kube-rbac-proxy" Apr 16 23:15:49.310351 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310339 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" Apr 16 23:15:49.310351 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310348 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" Apr 16 23:15:49.310602 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310356 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="storage-initializer" Apr 16 23:15:49.310602 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310362 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="storage-initializer" Apr 16 23:15:49.310602 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310407 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kube-rbac-proxy" Apr 16 23:15:49.310602 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310418 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9f060d0-fa40-44e4-8f4a-dd989af97836" containerName="storage-initializer" Apr 16 
23:15:49.310602 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310429 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="07efe955-2360-4ece-b0b8-7847bee6ff75" containerName="kserve-container" Apr 16 23:15:49.310602 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.310436 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9f060d0-fa40-44e4-8f4a-dd989af97836" containerName="storage-initializer" Apr 16 23:15:49.315146 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.315126 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.317564 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.317539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 23:15:49.317564 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.317558 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 16 23:15:49.317765 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.317546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-f9tz9\"" Apr 16 23:15:49.317765 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.317651 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 23:15:49.318390 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.318374 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 23:15:49.318390 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.318384 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 
23:15:49.318478 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.318383 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 16 23:15:49.323529 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.323506 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l"] Apr 16 23:15:49.456570 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.454254 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f060d0-fa40-44e4-8f4a-dd989af97836" path="/var/lib/kubelet/pods/a9f060d0-fa40-44e4-8f4a-dd989af97836/volumes" Apr 16 23:15:49.485079 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.485040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6lvc\" (UniqueName: \"kubernetes.io/projected/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kube-api-access-t6lvc\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.485241 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.485100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.485241 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.485161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.485241 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.485191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.485241 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.485211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.585840 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.585747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.585840 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.585800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.585840 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.585830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.586107 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.585848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.586107 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.585890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6lvc\" (UniqueName: \"kubernetes.io/projected/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kube-api-access-t6lvc\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.586107 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:15:49.585986 2576 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 16 23:15:49.586107 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:15:49.586052 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls podName:4e6b32a3-babc-43ea-980b-19d64b06ddc8 nodeName:}" failed. No retries permitted until 2026-04-16 23:15:50.086031929 +0000 UTC m=+3729.210656210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls") pod "isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" (UID: "4e6b32a3-babc-43ea-980b-19d64b06ddc8") : secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 16 23:15:49.586369 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.586314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.586551 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.586533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.586593 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.586565 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:49.594201 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:49.594170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6lvc\" (UniqueName: \"kubernetes.io/projected/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kube-api-access-t6lvc\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:50.089639 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:50.089601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:50.092056 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:50.092029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:50.226486 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:50.226433 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:50.351730 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:50.351616 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l"] Apr 16 23:15:50.354433 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:15:50.354398 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e6b32a3_babc_43ea_980b_19d64b06ddc8.slice/crio-36f5efd93b21ece04d536829efd0c3d94ee96426ba1ab76d21f77441e1cabbf4 WatchSource:0}: Error finding container 36f5efd93b21ece04d536829efd0c3d94ee96426ba1ab76d21f77441e1cabbf4: Status 404 returned error can't find the container with id 36f5efd93b21ece04d536829efd0c3d94ee96426ba1ab76d21f77441e1cabbf4 Apr 16 23:15:50.859355 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:50.859300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" event={"ID":"4e6b32a3-babc-43ea-980b-19d64b06ddc8","Type":"ContainerStarted","Data":"687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2"} Apr 16 23:15:50.859355 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:50.859357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" event={"ID":"4e6b32a3-babc-43ea-980b-19d64b06ddc8","Type":"ContainerStarted","Data":"36f5efd93b21ece04d536829efd0c3d94ee96426ba1ab76d21f77441e1cabbf4"} Apr 16 23:15:51.863211 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:51.863176 2576 generic.go:358] "Generic (PLEG): container finished" podID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerID="687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2" exitCode=0 Apr 16 23:15:51.863632 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:51.863243 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" event={"ID":"4e6b32a3-babc-43ea-980b-19d64b06ddc8","Type":"ContainerDied","Data":"687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2"} Apr 16 23:15:52.867947 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:52.867909 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" event={"ID":"4e6b32a3-babc-43ea-980b-19d64b06ddc8","Type":"ContainerStarted","Data":"f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0"} Apr 16 23:15:52.867947 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:52.867950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" event={"ID":"4e6b32a3-babc-43ea-980b-19d64b06ddc8","Type":"ContainerStarted","Data":"c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd"} Apr 16 23:15:52.868544 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:52.868063 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:52.886420 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:52.886364 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podStartSLOduration=3.886348419 podStartE2EDuration="3.886348419s" podCreationTimestamp="2026-04-16 23:15:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:15:52.884936987 +0000 UTC m=+3732.009561273" watchObservedRunningTime="2026-04-16 23:15:52.886348419 +0000 UTC m=+3732.010972700" Apr 16 23:15:53.871117 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:53.871085 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:53.872236 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:53.872205 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:15:54.873713 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:54.873670 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:15:59.878342 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:59.878291 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:15:59.878937 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:15:59.878908 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:16:09.878870 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:16:09.878828 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:16:19.879611 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:16:19.879572 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:16:29.879425 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:16:29.879315 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:16:39.878977 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:16:39.878938 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:16:49.879350 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:16:49.879298 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:16:59.879377 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:16:59.879346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:17:09.349722 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:09.349690 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l"] Apr 16 23:17:09.350276 
ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:09.350022 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" containerID="cri-o://c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd" gracePeriod=30 Apr 16 23:17:09.350276 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:09.350109 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kube-rbac-proxy" containerID="cri-o://f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0" gracePeriod=30 Apr 16 23:17:09.874645 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:09.874601 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.71:8643/healthz\": dial tcp 10.133.0.71:8643: connect: connection refused" Apr 16 23:17:09.878873 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:09.878845 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 23:17:10.109346 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.109294 2576 generic.go:358] "Generic (PLEG): container finished" podID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerID="f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0" exitCode=2 Apr 16 23:17:10.109516 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.109361 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" event={"ID":"4e6b32a3-babc-43ea-980b-19d64b06ddc8","Type":"ContainerDied","Data":"f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0"} Apr 16 23:17:10.427069 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.427029 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj"] Apr 16 23:17:10.431024 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.431002 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.431531 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.431505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.431649 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.431565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9k2n\" (UniqueName: \"kubernetes.io/projected/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kube-api-access-h9k2n\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.431649 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.431630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.431786 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.431649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.433488 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.433462 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 16 23:17:10.433617 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.433504 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 16 23:17:10.439541 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.439294 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj"] Apr 16 23:17:10.532135 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.532095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.532389 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.532169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9k2n\" (UniqueName: \"kubernetes.io/projected/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kube-api-access-h9k2n\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.532389 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.532209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.532389 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.532271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.532732 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.532698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" 
(UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.532948 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.532929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.534612 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.534582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.540553 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.540527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9k2n\" (UniqueName: \"kubernetes.io/projected/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kube-api-access-h9k2n\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.743500 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.743459 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:10.875599 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:10.875565 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj"] Apr 16 23:17:10.878084 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:17:10.878058 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203c61ab_db79_4dd4_b2a9_07a3ef6dfe4e.slice/crio-afa1736eefb122d09470dc74dd4476d1eb289e0c48cc95524e2ffc6841018b77 WatchSource:0}: Error finding container afa1736eefb122d09470dc74dd4476d1eb289e0c48cc95524e2ffc6841018b77: Status 404 returned error can't find the container with id afa1736eefb122d09470dc74dd4476d1eb289e0c48cc95524e2ffc6841018b77 Apr 16 23:17:11.114268 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:11.114173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" event={"ID":"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e","Type":"ContainerStarted","Data":"ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb"} Apr 16 23:17:11.114268 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:11.114216 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" event={"ID":"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e","Type":"ContainerStarted","Data":"afa1736eefb122d09470dc74dd4476d1eb289e0c48cc95524e2ffc6841018b77"} Apr 16 23:17:13.896574 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.896549 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:17:13.954433 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.954402 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kserve-provision-location\") pod \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " Apr 16 23:17:13.954433 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.954441 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-cabundle-cert\") pod \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " Apr 16 23:17:13.954688 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.954486 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6lvc\" (UniqueName: \"kubernetes.io/projected/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kube-api-access-t6lvc\") pod \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " Apr 16 23:17:13.954688 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.954517 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls\") pod \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " Apr 16 23:17:13.954688 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.954546 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod 
\"4e6b32a3-babc-43ea-980b-19d64b06ddc8\" (UID: \"4e6b32a3-babc-43ea-980b-19d64b06ddc8\") " Apr 16 23:17:13.954897 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.954870 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4e6b32a3-babc-43ea-980b-19d64b06ddc8" (UID: "4e6b32a3-babc-43ea-980b-19d64b06ddc8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:17:13.954965 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.954945 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4e6b32a3-babc-43ea-980b-19d64b06ddc8" (UID: "4e6b32a3-babc-43ea-980b-19d64b06ddc8"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:17:13.955007 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.954952 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "4e6b32a3-babc-43ea-980b-19d64b06ddc8" (UID: "4e6b32a3-babc-43ea-980b-19d64b06ddc8"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:17:13.956630 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.956606 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4e6b32a3-babc-43ea-980b-19d64b06ddc8" (UID: "4e6b32a3-babc-43ea-980b-19d64b06ddc8"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:17:13.956739 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:13.956720 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kube-api-access-t6lvc" (OuterVolumeSpecName: "kube-api-access-t6lvc") pod "4e6b32a3-babc-43ea-980b-19d64b06ddc8" (UID: "4e6b32a3-babc-43ea-980b-19d64b06ddc8"). InnerVolumeSpecName "kube-api-access-t6lvc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:17:14.055689 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.055600 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:14.055689 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.055629 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-cabundle-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:14.055689 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.055642 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6lvc\" (UniqueName: \"kubernetes.io/projected/4e6b32a3-babc-43ea-980b-19d64b06ddc8-kube-api-access-t6lvc\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:14.055689 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.055652 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e6b32a3-babc-43ea-980b-19d64b06ddc8-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:14.055689 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.055662 2576 reconciler_common.go:299] "Volume detached for volume 
\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e6b32a3-babc-43ea-980b-19d64b06ddc8-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:14.125886 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.125849 2576 generic.go:358] "Generic (PLEG): container finished" podID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerID="c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd" exitCode=0 Apr 16 23:17:14.126074 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.125931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" event={"ID":"4e6b32a3-babc-43ea-980b-19d64b06ddc8","Type":"ContainerDied","Data":"c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd"} Apr 16 23:17:14.126074 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.125945 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" Apr 16 23:17:14.126074 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.125967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l" event={"ID":"4e6b32a3-babc-43ea-980b-19d64b06ddc8","Type":"ContainerDied","Data":"36f5efd93b21ece04d536829efd0c3d94ee96426ba1ab76d21f77441e1cabbf4"} Apr 16 23:17:14.126074 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.125983 2576 scope.go:117] "RemoveContainer" containerID="f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0" Apr 16 23:17:14.134852 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.134836 2576 scope.go:117] "RemoveContainer" containerID="c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd" Apr 16 23:17:14.141888 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.141871 2576 scope.go:117] "RemoveContainer" containerID="687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2" Apr 16 23:17:14.147839 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.147811 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l"] Apr 16 23:17:14.149525 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.149504 2576 scope.go:117] "RemoveContainer" containerID="f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0" Apr 16 23:17:14.149824 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:17:14.149797 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0\": container with ID starting with f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0 not found: ID does not exist" containerID="f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0" Apr 16 23:17:14.149938 
ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.149830 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0"} err="failed to get container status \"f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0\": rpc error: code = NotFound desc = could not find container \"f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0\": container with ID starting with f1b1e433d8f74e5faee90e8bf1d9adfa1785764e92bfa2832423f7803da58ac0 not found: ID does not exist" Apr 16 23:17:14.149938 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.149849 2576 scope.go:117] "RemoveContainer" containerID="c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd" Apr 16 23:17:14.150131 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:17:14.150113 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd\": container with ID starting with c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd not found: ID does not exist" containerID="c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd" Apr 16 23:17:14.150209 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.150140 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd"} err="failed to get container status \"c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd\": rpc error: code = NotFound desc = could not find container \"c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd\": container with ID starting with c726bea391701a68c05a3d0e67f34e7435c6325aaf3bc608bad152f9951ef5cd not found: ID does not exist" Apr 16 23:17:14.150209 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.150162 2576 scope.go:117] 
"RemoveContainer" containerID="687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2" Apr 16 23:17:14.150675 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:17:14.150445 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2\": container with ID starting with 687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2 not found: ID does not exist" containerID="687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2" Apr 16 23:17:14.150675 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.150467 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2"} err="failed to get container status \"687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2\": rpc error: code = NotFound desc = could not find container \"687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2\": container with ID starting with 687e2220ff5c85f71663c3bc1759d4de647bd39477af9888b1c13ceb0e9ac2d2 not found: ID does not exist" Apr 16 23:17:14.150811 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:14.150773 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-nmm2l"] Apr 16 23:17:15.130640 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:15.130616 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj_203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e/storage-initializer/0.log" Apr 16 23:17:15.131120 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:15.130650 2576 generic.go:358] "Generic (PLEG): container finished" podID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerID="ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb" exitCode=1 Apr 16 
23:17:15.131120 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:15.130687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" event={"ID":"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e","Type":"ContainerDied","Data":"ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb"} Apr 16 23:17:15.453832 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:15.453795 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" path="/var/lib/kubelet/pods/4e6b32a3-babc-43ea-980b-19d64b06ddc8/volumes" Apr 16 23:17:16.135186 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:16.135159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj_203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e/storage-initializer/0.log" Apr 16 23:17:16.135617 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:16.135266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" event={"ID":"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e","Type":"ContainerStarted","Data":"2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d"} Apr 16 23:17:20.398511 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:20.398480 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj"] Apr 16 23:17:20.398920 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:20.398812 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" podUID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerName="storage-initializer" containerID="cri-o://2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d" gracePeriod=30 Apr 16 23:17:21.346594 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 23:17:21.346573 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj_203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e/storage-initializer/1.log" Apr 16 23:17:21.347018 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.346986 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj_203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e/storage-initializer/0.log" Apr 16 23:17:21.347159 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.347064 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:21.406177 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.406146 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-proxy-tls\") pod \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " Apr 16 23:17:21.406177 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.406180 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9k2n\" (UniqueName: \"kubernetes.io/projected/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kube-api-access-h9k2n\") pod \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " Apr 16 23:17:21.406702 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.406204 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " Apr 16 23:17:21.406702 
ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.406235 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kserve-provision-location\") pod \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\" (UID: \"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e\") " Apr 16 23:17:21.406702 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.406558 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" (UID: "203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:17:21.406702 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.406585 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" (UID: "203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:17:21.408627 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.408590 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kube-api-access-h9k2n" (OuterVolumeSpecName: "kube-api-access-h9k2n") pod "203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" (UID: "203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e"). InnerVolumeSpecName "kube-api-access-h9k2n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:17:21.408755 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.408733 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" (UID: "203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:17:21.475457 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475428 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp"] Apr 16 23:17:21.475738 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475726 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kube-rbac-proxy" Apr 16 23:17:21.475785 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475740 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kube-rbac-proxy" Apr 16 23:17:21.475785 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475754 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerName="storage-initializer" Apr 16 23:17:21.475785 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475761 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerName="storage-initializer" Apr 16 23:17:21.475785 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475769 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" Apr 16 23:17:21.475785 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475775 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" Apr 16 23:17:21.475785 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475784 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="storage-initializer" Apr 16 23:17:21.475976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475790 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="storage-initializer" Apr 16 23:17:21.475976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475840 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kserve-container" Apr 16 23:17:21.475976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475849 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerName="storage-initializer" Apr 16 23:17:21.475976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475855 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e6b32a3-babc-43ea-980b-19d64b06ddc8" containerName="kube-rbac-proxy" Apr 16 23:17:21.475976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475906 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerName="storage-initializer" Apr 16 23:17:21.475976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475912 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerName="storage-initializer" Apr 16 23:17:21.475976 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.475956 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerName="storage-initializer" Apr 16 23:17:21.479149 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.479133 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.481548 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.481523 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 16 23:17:21.481675 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.481627 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 23:17:21.481675 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.481649 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 16 23:17:21.489377 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.489354 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp"] Apr 16 23:17:21.506752 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.506905 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: 
\"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.506905 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ae94d07-bfab-4930-96d1-42b964fa4e5e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.506905 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.506905 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssjhk\" (UniqueName: \"kubernetes.io/projected/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kube-api-access-ssjhk\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.507145 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506955 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kserve-provision-location\") on node 
\"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:21.507145 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506970 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:21.507145 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506982 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9k2n\" (UniqueName: \"kubernetes.io/projected/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-kube-api-access-h9k2n\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:21.507145 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.506997 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:17:21.608146 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.608116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.608146 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.608150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssjhk\" (UniqueName: \"kubernetes.io/projected/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kube-api-access-ssjhk\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" 
(UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.608431 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.608213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.608431 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.608258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.608431 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.608300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ae94d07-bfab-4930-96d1-42b964fa4e5e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.608712 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.608686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.608867 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.608848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.608937 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.608918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.610644 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.610621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ae94d07-bfab-4930-96d1-42b964fa4e5e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.615839 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.615821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssjhk\" (UniqueName: \"kubernetes.io/projected/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kube-api-access-ssjhk\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.790236 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.790132 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:21.910949 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:21.910915 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp"] Apr 16 23:17:21.914528 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:17:21.914496 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae94d07_bfab_4930_96d1_42b964fa4e5e.slice/crio-478a965cab7ca6c2f095094471124fdc5bc7c1425d8b75698fe872232a4753a0 WatchSource:0}: Error finding container 478a965cab7ca6c2f095094471124fdc5bc7c1425d8b75698fe872232a4753a0: Status 404 returned error can't find the container with id 478a965cab7ca6c2f095094471124fdc5bc7c1425d8b75698fe872232a4753a0 Apr 16 23:17:22.156699 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.156608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj_203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e/storage-initializer/1.log" Apr 16 23:17:22.156975 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.156955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj_203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e/storage-initializer/0.log" Apr 16 23:17:22.157082 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.156991 2576 generic.go:358] "Generic (PLEG): container finished" podID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" containerID="2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d" exitCode=1 Apr 16 23:17:22.157082 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:17:22.157026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" event={"ID":"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e","Type":"ContainerDied","Data":"2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d"} Apr 16 23:17:22.157082 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.157062 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" event={"ID":"203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e","Type":"ContainerDied","Data":"afa1736eefb122d09470dc74dd4476d1eb289e0c48cc95524e2ffc6841018b77"} Apr 16 23:17:22.157082 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.157082 2576 scope.go:117] "RemoveContainer" containerID="2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d" Apr 16 23:17:22.157310 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.157096 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj" Apr 16 23:17:22.158826 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.158799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" event={"ID":"4ae94d07-bfab-4930-96d1-42b964fa4e5e","Type":"ContainerStarted","Data":"de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0"} Apr 16 23:17:22.158951 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.158834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" event={"ID":"4ae94d07-bfab-4930-96d1-42b964fa4e5e","Type":"ContainerStarted","Data":"478a965cab7ca6c2f095094471124fdc5bc7c1425d8b75698fe872232a4753a0"} Apr 16 23:17:22.164953 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.164936 2576 scope.go:117] "RemoveContainer" containerID="ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb" Apr 16 23:17:22.171732 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.171714 2576 scope.go:117] "RemoveContainer" containerID="2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d" Apr 16 23:17:22.171983 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:17:22.171964 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d\": container with ID starting with 2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d not found: ID does not exist" containerID="2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d" Apr 16 23:17:22.172068 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.171990 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d"} err="failed to 
get container status \"2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d\": rpc error: code = NotFound desc = could not find container \"2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d\": container with ID starting with 2c0b23500bcc95a9c484642e89a20a4f71c163b183f5c38ab9d40b5b10ce3d0d not found: ID does not exist" Apr 16 23:17:22.172068 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.172007 2576 scope.go:117] "RemoveContainer" containerID="ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb" Apr 16 23:17:22.172256 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:17:22.172237 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb\": container with ID starting with ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb not found: ID does not exist" containerID="ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb" Apr 16 23:17:22.172318 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.172261 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb"} err="failed to get container status \"ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb\": rpc error: code = NotFound desc = could not find container \"ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb\": container with ID starting with ca8479cac52ae1e59407fee603ae3099d7ce4ea57b3579155607fdd56e6b49bb not found: ID does not exist" Apr 16 23:17:22.189507 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.189473 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj"] Apr 16 23:17:22.195420 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:22.195390 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-pqvbj"] Apr 16 23:17:23.164622 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:23.164530 2576 generic.go:358] "Generic (PLEG): container finished" podID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerID="de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0" exitCode=0 Apr 16 23:17:23.164622 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:23.164579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" event={"ID":"4ae94d07-bfab-4930-96d1-42b964fa4e5e","Type":"ContainerDied","Data":"de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0"} Apr 16 23:17:23.454237 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:23.454202 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e" path="/var/lib/kubelet/pods/203c61ab-db79-4dd4-b2a9-07a3ef6dfe4e/volumes" Apr 16 23:17:24.169896 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:24.169860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" event={"ID":"4ae94d07-bfab-4930-96d1-42b964fa4e5e","Type":"ContainerStarted","Data":"5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177"} Apr 16 23:17:24.169896 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:24.169900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" event={"ID":"4ae94d07-bfab-4930-96d1-42b964fa4e5e","Type":"ContainerStarted","Data":"9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0"} Apr 16 23:17:24.170367 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:24.170117 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 
23:17:24.170367 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:24.170245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:24.171562 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:24.171536 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 23:17:24.187118 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:24.187074 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podStartSLOduration=3.187061891 podStartE2EDuration="3.187061891s" podCreationTimestamp="2026-04-16 23:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:17:24.186199032 +0000 UTC m=+3823.310823355" watchObservedRunningTime="2026-04-16 23:17:24.187061891 +0000 UTC m=+3823.311686175" Apr 16 23:17:25.173654 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:25.173615 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 23:17:30.178228 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:30.178201 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:17:30.178698 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:30.178673 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 23:17:40.179562 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:40.179522 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 23:17:50.178957 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:17:50.178913 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 23:18:00.179288 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:00.179193 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 23:18:10.179418 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:10.179379 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 23:18:20.179376 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:20.179316 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 23:18:30.180191 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:30.180158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:18:31.500656 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:31.500622 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp"] Apr 16 23:18:31.501125 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:31.500899 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container" containerID="cri-o://9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0" gracePeriod=30 Apr 16 23:18:31.501125 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:31.500930 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kube-rbac-proxy" containerID="cri-o://5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177" gracePeriod=30 Apr 16 23:18:32.377361 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.377307 2576 generic.go:358] "Generic (PLEG): container finished" podID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerID="5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177" exitCode=2 Apr 16 23:18:32.377554 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.377385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" event={"ID":"4ae94d07-bfab-4930-96d1-42b964fa4e5e","Type":"ContainerDied","Data":"5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177"} Apr 16 23:18:32.564639 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.564592 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr"] Apr 16 23:18:32.568110 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.568088 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.570780 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.570745 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 16 23:18:32.570917 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.570744 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 16 23:18:32.575956 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.575933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr"] Apr 16 23:18:32.664293 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.664193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd59f530-3bb1-4403-8210-4c10b5f3e19e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.664293 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.664239 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.664293 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.664267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kqtf\" (UniqueName: \"kubernetes.io/projected/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kube-api-access-8kqtf\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.664569 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.664384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd59f530-3bb1-4403-8210-4c10b5f3e19e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.765414 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.765373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd59f530-3bb1-4403-8210-4c10b5f3e19e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: 
\"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.765606 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.765496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd59f530-3bb1-4403-8210-4c10b5f3e19e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.765606 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.765529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.765606 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.765559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kqtf\" (UniqueName: \"kubernetes.io/projected/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kube-api-access-8kqtf\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.765895 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.765869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.766125 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.766107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd59f530-3bb1-4403-8210-4c10b5f3e19e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.767940 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.767916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd59f530-3bb1-4403-8210-4c10b5f3e19e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.773421 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.773389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kqtf\" (UniqueName: \"kubernetes.io/projected/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kube-api-access-8kqtf\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:32.880387 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:32.880352 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" Apr 16 23:18:33.009524 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:33.009489 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr"] Apr 16 23:18:33.012030 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:18:33.011997 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd59f530_3bb1_4403_8210_4c10b5f3e19e.slice/crio-a78ddcaa044046649a9b8f6d4ac254245b5b46e936e810b603e886a76089e4c9 WatchSource:0}: Error finding container a78ddcaa044046649a9b8f6d4ac254245b5b46e936e810b603e886a76089e4c9: Status 404 returned error can't find the container with id a78ddcaa044046649a9b8f6d4ac254245b5b46e936e810b603e886a76089e4c9 Apr 16 23:18:33.381424 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:33.381385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" event={"ID":"bd59f530-3bb1-4403-8210-4c10b5f3e19e","Type":"ContainerStarted","Data":"ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02"} Apr 16 23:18:33.381601 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:33.381431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" event={"ID":"bd59f530-3bb1-4403-8210-4c10b5f3e19e","Type":"ContainerStarted","Data":"a78ddcaa044046649a9b8f6d4ac254245b5b46e936e810b603e886a76089e4c9"} Apr 16 23:18:35.174128 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:35.174063 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.133.0.73:8643/healthz\": dial tcp 10.133.0.73:8643: connect: connection refused" Apr 16 23:18:36.045468 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.045425 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" Apr 16 23:18:36.194312 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.194283 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-cabundle-cert\") pod \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " Apr 16 23:18:36.194768 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.194339 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kserve-provision-location\") pod \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " Apr 16 23:18:36.194768 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.194387 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " Apr 16 23:18:36.194768 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.194413 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ae94d07-bfab-4930-96d1-42b964fa4e5e-proxy-tls\") pod \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " Apr 16 23:18:36.194768 ip-10-0-133-183 kubenswrapper[2576]: I0416 
23:18:36.194439 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssjhk\" (UniqueName: \"kubernetes.io/projected/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kube-api-access-ssjhk\") pod \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\" (UID: \"4ae94d07-bfab-4930-96d1-42b964fa4e5e\") " Apr 16 23:18:36.194768 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.194737 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ae94d07-bfab-4930-96d1-42b964fa4e5e" (UID: "4ae94d07-bfab-4930-96d1-42b964fa4e5e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:18:36.194768 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.194760 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4ae94d07-bfab-4930-96d1-42b964fa4e5e" (UID: "4ae94d07-bfab-4930-96d1-42b964fa4e5e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:18:36.195042 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.194856 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "4ae94d07-bfab-4930-96d1-42b964fa4e5e" (UID: "4ae94d07-bfab-4930-96d1-42b964fa4e5e"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:18:36.196618 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.196593 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae94d07-bfab-4930-96d1-42b964fa4e5e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ae94d07-bfab-4930-96d1-42b964fa4e5e" (UID: "4ae94d07-bfab-4930-96d1-42b964fa4e5e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:18:36.196692 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.196602 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kube-api-access-ssjhk" (OuterVolumeSpecName: "kube-api-access-ssjhk") pod "4ae94d07-bfab-4930-96d1-42b964fa4e5e" (UID: "4ae94d07-bfab-4930-96d1-42b964fa4e5e"). InnerVolumeSpecName "kube-api-access-ssjhk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:18:36.295345 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.295298 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ae94d07-bfab-4930-96d1-42b964fa4e5e-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:36.295521 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.295353 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ssjhk\" (UniqueName: \"kubernetes.io/projected/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kube-api-access-ssjhk\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:36.295521 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.295370 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-cabundle-cert\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:36.295521 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.295386 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae94d07-bfab-4930-96d1-42b964fa4e5e-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:36.295521 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.295396 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ae94d07-bfab-4930-96d1-42b964fa4e5e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:36.398268 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.398235 2576 generic.go:358] "Generic (PLEG): container finished" podID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerID="9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0" exitCode=0
Apr 16 23:18:36.398453 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.398319 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp"
Apr 16 23:18:36.398453 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.398341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" event={"ID":"4ae94d07-bfab-4930-96d1-42b964fa4e5e","Type":"ContainerDied","Data":"9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0"}
Apr 16 23:18:36.398453 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.398379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp" event={"ID":"4ae94d07-bfab-4930-96d1-42b964fa4e5e","Type":"ContainerDied","Data":"478a965cab7ca6c2f095094471124fdc5bc7c1425d8b75698fe872232a4753a0"}
Apr 16 23:18:36.398453 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.398396 2576 scope.go:117] "RemoveContainer" containerID="5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177"
Apr 16 23:18:36.406073 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.406047 2576 scope.go:117] "RemoveContainer" containerID="9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0"
Apr 16 23:18:36.413811 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.413791 2576 scope.go:117] "RemoveContainer" containerID="de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0"
Apr 16 23:18:36.420493 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.420475 2576 scope.go:117] "RemoveContainer" containerID="5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177"
Apr 16 23:18:36.420739 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:18:36.420720 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177\": container with ID starting with 5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177 not found: ID does not exist" containerID="5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177"
Apr 16 23:18:36.420800 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.420748 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177"} err="failed to get container status \"5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177\": rpc error: code = NotFound desc = could not find container \"5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177\": container with ID starting with 5ad4dce34a1efaef230a624c6993581a0c2ac1f8d8b55beb146bcf8145288177 not found: ID does not exist"
Apr 16 23:18:36.420800 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.420767 2576 scope.go:117] "RemoveContainer" containerID="9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0"
Apr 16 23:18:36.421006 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:18:36.420990 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0\": container with ID starting with 9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0 not found: ID does not exist" containerID="9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0"
Apr 16 23:18:36.421052 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.421015 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0"} err="failed to get container status \"9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0\": rpc error: code = NotFound desc = could not find container \"9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0\": container with ID starting with 9564fc8ae0165cab1c1729068e393a6765517853d9c3b84d3a83a5b5306527a0 not found: ID does not exist"
Apr 16 23:18:36.421052 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.421031 2576 scope.go:117] "RemoveContainer" containerID="de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0"
Apr 16 23:18:36.421289 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:18:36.421264 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0\": container with ID starting with de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0 not found: ID does not exist" containerID="de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0"
Apr 16 23:18:36.421505 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.421293 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0"} err="failed to get container status \"de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0\": rpc error: code = NotFound desc = could not find container \"de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0\": container with ID starting with de4ba7b34b517abea31088c6554ec6643d3f0c6716d4fa3004d1b51b17a5e2c0 not found: ID does not exist"
Apr 16 23:18:36.423254 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.423233 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp"]
Apr 16 23:18:36.427314 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:36.427293 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-n2hcp"]
Apr 16 23:18:37.453128 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:37.453098 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" path="/var/lib/kubelet/pods/4ae94d07-bfab-4930-96d1-42b964fa4e5e/volumes"
Apr 16 23:18:39.408708 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:39.408680 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr_bd59f530-3bb1-4403-8210-4c10b5f3e19e/storage-initializer/0.log"
Apr 16 23:18:39.409191 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:39.408717 2576 generic.go:358] "Generic (PLEG): container finished" podID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerID="ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02" exitCode=1
Apr 16 23:18:39.409191 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:39.408806 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" event={"ID":"bd59f530-3bb1-4403-8210-4c10b5f3e19e","Type":"ContainerDied","Data":"ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02"}
Apr 16 23:18:40.413568 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:40.413541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr_bd59f530-3bb1-4403-8210-4c10b5f3e19e/storage-initializer/0.log"
Apr 16 23:18:40.413935 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:40.413640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" event={"ID":"bd59f530-3bb1-4403-8210-4c10b5f3e19e","Type":"ContainerStarted","Data":"ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef"}
Apr 16 23:18:41.567795 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:41.567751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr_bd59f530-3bb1-4403-8210-4c10b5f3e19e/storage-initializer/0.log"
Apr 16 23:18:41.575953 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:41.575922 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr_bd59f530-3bb1-4403-8210-4c10b5f3e19e/storage-initializer/0.log"
Apr 16 23:18:42.547688 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:42.547654 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr"]
Apr 16 23:18:42.548019 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:42.547991 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" podUID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerName="storage-initializer" containerID="cri-o://ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef" gracePeriod=30
Apr 16 23:18:44.114721 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.114695 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr_bd59f530-3bb1-4403-8210-4c10b5f3e19e/storage-initializer/1.log"
Apr 16 23:18:44.115090 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.115073 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr_bd59f530-3bb1-4403-8210-4c10b5f3e19e/storage-initializer/0.log"
Apr 16 23:18:44.115148 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.115138 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr"
Apr 16 23:18:44.153353 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.153241 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kserve-provision-location\") pod \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") "
Apr 16 23:18:44.153353 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.153282 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kqtf\" (UniqueName: \"kubernetes.io/projected/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kube-api-access-8kqtf\") pod \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") "
Apr 16 23:18:44.153353 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.153302 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd59f530-3bb1-4403-8210-4c10b5f3e19e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") "
Apr 16 23:18:44.153353 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.153338 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd59f530-3bb1-4403-8210-4c10b5f3e19e-proxy-tls\") pod \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\" (UID: \"bd59f530-3bb1-4403-8210-4c10b5f3e19e\") "
Apr 16 23:18:44.153676 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.153580 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bd59f530-3bb1-4403-8210-4c10b5f3e19e" (UID: "bd59f530-3bb1-4403-8210-4c10b5f3e19e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:18:44.153676 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.153651 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd59f530-3bb1-4403-8210-4c10b5f3e19e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "bd59f530-3bb1-4403-8210-4c10b5f3e19e" (UID: "bd59f530-3bb1-4403-8210-4c10b5f3e19e"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:18:44.155390 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.155369 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd59f530-3bb1-4403-8210-4c10b5f3e19e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bd59f530-3bb1-4403-8210-4c10b5f3e19e" (UID: "bd59f530-3bb1-4403-8210-4c10b5f3e19e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:18:44.155562 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.155543 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kube-api-access-8kqtf" (OuterVolumeSpecName: "kube-api-access-8kqtf") pod "bd59f530-3bb1-4403-8210-4c10b5f3e19e" (UID: "bd59f530-3bb1-4403-8210-4c10b5f3e19e"). InnerVolumeSpecName "kube-api-access-8kqtf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:18:44.253959 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.253928 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kserve-provision-location\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:44.253959 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.253956 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kqtf\" (UniqueName: \"kubernetes.io/projected/bd59f530-3bb1-4403-8210-4c10b5f3e19e-kube-api-access-8kqtf\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:44.253959 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.253966 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bd59f530-3bb1-4403-8210-4c10b5f3e19e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:44.254206 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.253976 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd59f530-3bb1-4403-8210-4c10b5f3e19e-proxy-tls\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\""
Apr 16 23:18:44.427771 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.427685 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr_bd59f530-3bb1-4403-8210-4c10b5f3e19e/storage-initializer/1.log"
Apr 16 23:18:44.428045 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.428031 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr_bd59f530-3bb1-4403-8210-4c10b5f3e19e/storage-initializer/0.log"
Apr 16 23:18:44.428101 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.428070 2576 generic.go:358] "Generic (PLEG): container finished" podID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerID="ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef" exitCode=1
Apr 16 23:18:44.428153 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.428119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" event={"ID":"bd59f530-3bb1-4403-8210-4c10b5f3e19e","Type":"ContainerDied","Data":"ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef"}
Apr 16 23:18:44.428193 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.428153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr" event={"ID":"bd59f530-3bb1-4403-8210-4c10b5f3e19e","Type":"ContainerDied","Data":"a78ddcaa044046649a9b8f6d4ac254245b5b46e936e810b603e886a76089e4c9"}
Apr 16 23:18:44.428193 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.428160 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr"
Apr 16 23:18:44.428193 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.428173 2576 scope.go:117] "RemoveContainer" containerID="ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef"
Apr 16 23:18:44.436281 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.436263 2576 scope.go:117] "RemoveContainer" containerID="ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02"
Apr 16 23:18:44.443221 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.443205 2576 scope.go:117] "RemoveContainer" containerID="ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef"
Apr 16 23:18:44.443549 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:18:44.443529 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef\": container with ID starting with ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef not found: ID does not exist" containerID="ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef"
Apr 16 23:18:44.443595 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.443559 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef"} err="failed to get container status \"ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef\": rpc error: code = NotFound desc = could not find container \"ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef\": container with ID starting with ccd5bef5f3ae98fa0072f0360a12e600436b6aad1cd40dc4096dfcda1b6b09ef not found: ID does not exist"
Apr 16 23:18:44.443595 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.443578 2576 scope.go:117] "RemoveContainer" containerID="ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02"
Apr 16 23:18:44.443815 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:18:44.443792 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02\": container with ID starting with ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02 not found: ID does not exist" containerID="ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02"
Apr 16 23:18:44.443861 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.443822 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02"} err="failed to get container status \"ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02\": rpc error: code = NotFound desc = could not find container \"ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02\": container with ID starting with ff3a99f21feb396e280e2e484892281f34bc1e2089d4f7803bc1239a4e1ede02 not found: ID does not exist"
Apr 16 23:18:44.464704 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.463044 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr"]
Apr 16 23:18:44.467712 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.467661 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-8r5vr"]
Apr 16 23:18:44.743386 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743337 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rl447/must-gather-ntwq4"]
Apr 16 23:18:44.743739 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743720 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerName="storage-initializer"
Apr 16 23:18:44.743824 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743742 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerName="storage-initializer"
Apr 16 23:18:44.743824 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743761 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container"
Apr 16 23:18:44.743824 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743771 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container"
Apr 16 23:18:44.743824 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743783 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kube-rbac-proxy"
Apr 16 23:18:44.743824 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743791 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kube-rbac-proxy"
Apr 16 23:18:44.743824 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743800 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerName="storage-initializer"
Apr 16 23:18:44.743824 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743809 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerName="storage-initializer"
Apr 16 23:18:44.744173 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743827 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="storage-initializer"
Apr 16 23:18:44.744173 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743836 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="storage-initializer"
Apr 16 23:18:44.744173 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743910 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerName="storage-initializer"
Apr 16 23:18:44.744173 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743925 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kube-rbac-proxy"
Apr 16 23:18:44.744173 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.743937 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ae94d07-bfab-4930-96d1-42b964fa4e5e" containerName="kserve-container"
Apr 16 23:18:44.744173 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.744074 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" containerName="storage-initializer"
Apr 16 23:18:44.748471 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.748446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:18:44.750997 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.750969 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rl447\"/\"openshift-service-ca.crt\""
Apr 16 23:18:44.751115 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.750999 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rl447\"/\"kube-root-ca.crt\""
Apr 16 23:18:44.753512 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.753483 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rl447/must-gather-ntwq4"]
Apr 16 23:18:44.756773 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.756752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdbrc\" (UniqueName: \"kubernetes.io/projected/a2c27ea4-16b1-4c24-af22-cabe43cf3754-kube-api-access-fdbrc\") pod \"must-gather-ntwq4\" (UID: \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\") " pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:18:44.756880 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.756821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a2c27ea4-16b1-4c24-af22-cabe43cf3754-must-gather-output\") pod \"must-gather-ntwq4\" (UID: \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\") " pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:18:44.857582 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.857549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a2c27ea4-16b1-4c24-af22-cabe43cf3754-must-gather-output\") pod \"must-gather-ntwq4\" (UID: \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\") " pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:18:44.857771 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.857625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdbrc\" (UniqueName: \"kubernetes.io/projected/a2c27ea4-16b1-4c24-af22-cabe43cf3754-kube-api-access-fdbrc\") pod \"must-gather-ntwq4\" (UID: \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\") " pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:18:44.857941 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.857921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a2c27ea4-16b1-4c24-af22-cabe43cf3754-must-gather-output\") pod \"must-gather-ntwq4\" (UID: \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\") " pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:18:44.866009 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:44.865979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdbrc\" (UniqueName: \"kubernetes.io/projected/a2c27ea4-16b1-4c24-af22-cabe43cf3754-kube-api-access-fdbrc\") pod \"must-gather-ntwq4\" (UID: \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\") " pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:18:45.070364 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:45.070275 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:18:45.187048 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:45.186925 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rl447/must-gather-ntwq4"]
Apr 16 23:18:45.189739 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:18:45.189714 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2c27ea4_16b1_4c24_af22_cabe43cf3754.slice/crio-b0f3c359bb87ba15e5c8029b0f7aa1f786f26eb0d4f9b0881f8964221d0985dd WatchSource:0}: Error finding container b0f3c359bb87ba15e5c8029b0f7aa1f786f26eb0d4f9b0881f8964221d0985dd: Status 404 returned error can't find the container with id b0f3c359bb87ba15e5c8029b0f7aa1f786f26eb0d4f9b0881f8964221d0985dd
Apr 16 23:18:45.433749 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:45.433655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rl447/must-gather-ntwq4" event={"ID":"a2c27ea4-16b1-4c24-af22-cabe43cf3754","Type":"ContainerStarted","Data":"b0f3c359bb87ba15e5c8029b0f7aa1f786f26eb0d4f9b0881f8964221d0985dd"}
Apr 16 23:18:45.453111 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:45.453080 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd59f530-3bb1-4403-8210-4c10b5f3e19e" path="/var/lib/kubelet/pods/bd59f530-3bb1-4403-8210-4c10b5f3e19e/volumes"
Apr 16 23:18:49.452839 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:49.452802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rl447/must-gather-ntwq4" event={"ID":"a2c27ea4-16b1-4c24-af22-cabe43cf3754","Type":"ContainerStarted","Data":"0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472"}
Apr 16 23:18:49.452839 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:49.452840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rl447/must-gather-ntwq4" event={"ID":"a2c27ea4-16b1-4c24-af22-cabe43cf3754","Type":"ContainerStarted","Data":"acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1"}
Apr 16 23:18:49.467814 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:18:49.467764 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rl447/must-gather-ntwq4" podStartSLOduration=1.511605963 podStartE2EDuration="5.467748074s" podCreationTimestamp="2026-04-16 23:18:44 +0000 UTC" firstStartedPulling="2026-04-16 23:18:45.191304511 +0000 UTC m=+3904.315928776" lastFinishedPulling="2026-04-16 23:18:49.14744661 +0000 UTC m=+3908.272070887" observedRunningTime="2026-04-16 23:18:49.465892468 +0000 UTC m=+3908.590516755" watchObservedRunningTime="2026-04-16 23:18:49.467748074 +0000 UTC m=+3908.592372357"
Apr 16 23:19:11.527998 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:11.527962 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerID="acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1" exitCode=0
Apr 16 23:19:11.528447 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:11.528018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rl447/must-gather-ntwq4" event={"ID":"a2c27ea4-16b1-4c24-af22-cabe43cf3754","Type":"ContainerDied","Data":"acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1"}
Apr 16 23:19:11.528447 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:11.528315 2576 scope.go:117] "RemoveContainer" containerID="acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1"
Apr 16 23:19:11.712292 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:11.712258 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rl447_must-gather-ntwq4_a2c27ea4-16b1-4c24-af22-cabe43cf3754/gather/0.log"
Apr 16 23:19:15.083611 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:15.083574 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kzvxc_c77a6141-b229-4002-8fb2-722d0a6093bb/global-pull-secret-syncer/0.log"
Apr 16 23:19:15.224309 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:15.224276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lsjm9_887606c2-6d2b-4ac2-820a-f012e1cb7100/konnectivity-agent/0.log"
Apr 16 23:19:15.297635 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:15.297607 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-183.ec2.internal_91f722200116ede2db7277922fd7931a/haproxy/0.log"
Apr 16 23:19:17.229533 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.229500 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rl447/must-gather-ntwq4"]
Apr 16 23:19:17.230903 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.229778 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rl447/must-gather-ntwq4" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerName="copy" containerID="cri-o://0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472" gracePeriod=2
Apr 16 23:19:17.232497 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.232459 2576 status_manager.go:895] "Failed to get status for pod" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" pod="openshift-must-gather-rl447/must-gather-ntwq4" err="pods \"must-gather-ntwq4\" is forbidden: User \"system:node:ip-10-0-133-183.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rl447\": no relationship found between node 'ip-10-0-133-183.ec2.internal' and this object"
Apr 16 23:19:17.233990 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.233381 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rl447/must-gather-ntwq4"]
Apr 16 23:19:17.463969 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.463948 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rl447_must-gather-ntwq4_a2c27ea4-16b1-4c24-af22-cabe43cf3754/copy/0.log"
Apr 16 23:19:17.464265 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.464250 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:19:17.533745 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.533665 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdbrc\" (UniqueName: \"kubernetes.io/projected/a2c27ea4-16b1-4c24-af22-cabe43cf3754-kube-api-access-fdbrc\") pod \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\" (UID: \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\") "
Apr 16 23:19:17.533745 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.533708 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a2c27ea4-16b1-4c24-af22-cabe43cf3754-must-gather-output\") pod \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\" (UID: \"a2c27ea4-16b1-4c24-af22-cabe43cf3754\") "
Apr 16 23:19:17.535288 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.535257 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c27ea4-16b1-4c24-af22-cabe43cf3754-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a2c27ea4-16b1-4c24-af22-cabe43cf3754" (UID: "a2c27ea4-16b1-4c24-af22-cabe43cf3754"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:19:17.535906 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.535884 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c27ea4-16b1-4c24-af22-cabe43cf3754-kube-api-access-fdbrc" (OuterVolumeSpecName: "kube-api-access-fdbrc") pod "a2c27ea4-16b1-4c24-af22-cabe43cf3754" (UID: "a2c27ea4-16b1-4c24-af22-cabe43cf3754"). InnerVolumeSpecName "kube-api-access-fdbrc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:19:17.545939 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.545921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rl447_must-gather-ntwq4_a2c27ea4-16b1-4c24-af22-cabe43cf3754/copy/0.log"
Apr 16 23:19:17.546199 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.546176 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerID="0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472" exitCode=143
Apr 16 23:19:17.546292 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.546219 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rl447/must-gather-ntwq4"
Apr 16 23:19:17.546292 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.546256 2576 scope.go:117] "RemoveContainer" containerID="0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472"
Apr 16 23:19:17.553398 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.553208 2576 scope.go:117] "RemoveContainer" containerID="acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1"
Apr 16 23:19:17.565756 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.565734 2576 scope.go:117] "RemoveContainer" containerID="0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472"
Apr 16 23:19:17.566010 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:19:17.565987 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472\": container with ID starting with 0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472 not found: ID does not exist" containerID="0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472"
Apr 16 23:19:17.566057 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.566021 2576
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472"} err="failed to get container status \"0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472\": rpc error: code = NotFound desc = could not find container \"0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472\": container with ID starting with 0f7b592c2ba13973883da3e9e16bf47d68a417bb58c794e21432834680147472 not found: ID does not exist" Apr 16 23:19:17.566057 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.566039 2576 scope.go:117] "RemoveContainer" containerID="acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1" Apr 16 23:19:17.566254 ip-10-0-133-183 kubenswrapper[2576]: E0416 23:19:17.566234 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1\": container with ID starting with acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1 not found: ID does not exist" containerID="acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1" Apr 16 23:19:17.566295 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.566261 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1"} err="failed to get container status \"acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1\": rpc error: code = NotFound desc = could not find container \"acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1\": container with ID starting with acb3fc130c70e01d63213920f76d8843094317bce24565657cb3c23550f5ede1 not found: ID does not exist" Apr 16 23:19:17.634663 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.634626 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdbrc\" (UniqueName: 
\"kubernetes.io/projected/a2c27ea4-16b1-4c24-af22-cabe43cf3754-kube-api-access-fdbrc\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:19:17.634663 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:17.634658 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a2c27ea4-16b1-4c24-af22-cabe43cf3754-must-gather-output\") on node \"ip-10-0-133-183.ec2.internal\" DevicePath \"\"" Apr 16 23:19:19.453572 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:19.453541 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" path="/var/lib/kubelet/pods/a2c27ea4-16b1-4c24-af22-cabe43cf3754/volumes" Apr 16 23:19:19.535135 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:19.535105 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-774dh_7fa9245e-fc11-4e0b-9c31-017700afca37/node-exporter/0.log" Apr 16 23:19:19.566387 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:19.566353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-774dh_7fa9245e-fc11-4e0b-9c31-017700afca37/kube-rbac-proxy/0.log" Apr 16 23:19:19.601208 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:19.601181 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-774dh_7fa9245e-fc11-4e0b-9c31-017700afca37/init-textfile/0.log" Apr 16 23:19:20.023498 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:20.023459 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-s2ssf_5f4172c2-c04a-49df-b80d-bc8d1949aaca/prometheus-operator/0.log" Apr 16 23:19:20.046394 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:20.046352 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-s2ssf_5f4172c2-c04a-49df-b80d-bc8d1949aaca/kube-rbac-proxy/0.log" Apr 16 23:19:21.955559 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.955522 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh"] Apr 16 23:19:21.955964 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.955854 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerName="gather" Apr 16 23:19:21.955964 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.955866 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerName="gather" Apr 16 23:19:21.955964 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.955878 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerName="copy" Apr 16 23:19:21.955964 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.955883 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerName="copy" Apr 16 23:19:21.955964 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.955955 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerName="gather" Apr 16 23:19:21.955964 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.955967 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2c27ea4-16b1-4c24-af22-cabe43cf3754" containerName="copy" Apr 16 23:19:21.959757 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.959738 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:21.962183 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.962156 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8ccpf\"/\"default-dockercfg-hkh9t\"" Apr 16 23:19:21.962287 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.962157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8ccpf\"/\"openshift-service-ca.crt\"" Apr 16 23:19:21.962287 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.962194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8ccpf\"/\"kube-root-ca.crt\"" Apr 16 23:19:21.965274 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:21.965253 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh"] Apr 16 23:19:22.072550 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.072510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-podres\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.072550 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.072548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwk2w\" (UniqueName: \"kubernetes.io/projected/cb7aad6a-c578-44ba-8cfd-14396359d7a6-kube-api-access-gwk2w\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.072786 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.072628 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-sys\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.072786 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.072674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-proc\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.072786 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.072690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-lib-modules\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.173934 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.173884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-sys\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.173934 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.173937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-proc\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " 
pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.174164 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.173954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-lib-modules\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.174164 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.174000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-podres\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.174164 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.174017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwk2w\" (UniqueName: \"kubernetes.io/projected/cb7aad6a-c578-44ba-8cfd-14396359d7a6-kube-api-access-gwk2w\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.174164 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.174019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-proc\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.174164 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.174019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-sys\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.174164 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.174126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-lib-modules\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.174164 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.174132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb7aad6a-c578-44ba-8cfd-14396359d7a6-podres\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.181743 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.181721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwk2w\" (UniqueName: \"kubernetes.io/projected/cb7aad6a-c578-44ba-8cfd-14396359d7a6-kube-api-access-gwk2w\") pod \"perf-node-gather-daemonset-x6mdh\" (UID: \"cb7aad6a-c578-44ba-8cfd-14396359d7a6\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.270996 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.270898 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.390220 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.390191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh"] Apr 16 23:19:22.393148 ip-10-0-133-183 kubenswrapper[2576]: W0416 23:19:22.393114 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcb7aad6a_c578_44ba_8cfd_14396359d7a6.slice/crio-c9532792cc0c04a2630c08c307bc44b9880c0fc05e2e8b384cb308a9863431d7 WatchSource:0}: Error finding container c9532792cc0c04a2630c08c307bc44b9880c0fc05e2e8b384cb308a9863431d7: Status 404 returned error can't find the container with id c9532792cc0c04a2630c08c307bc44b9880c0fc05e2e8b384cb308a9863431d7 Apr 16 23:19:22.394648 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.394632 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:19:22.563704 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.563607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" event={"ID":"cb7aad6a-c578-44ba-8cfd-14396359d7a6","Type":"ContainerStarted","Data":"157d72d5f5ad0665b0b06061ad8ce94434a8f453470b526fa62c2a12ea4101ee"} Apr 16 23:19:22.563704 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.563642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" event={"ID":"cb7aad6a-c578-44ba-8cfd-14396359d7a6","Type":"ContainerStarted","Data":"c9532792cc0c04a2630c08c307bc44b9880c0fc05e2e8b384cb308a9863431d7"} Apr 16 23:19:22.563704 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:22.563666 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:22.578980 ip-10-0-133-183 
kubenswrapper[2576]: I0416 23:19:22.578911 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" podStartSLOduration=1.578896649 podStartE2EDuration="1.578896649s" podCreationTimestamp="2026-04-16 23:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:19:22.577770185 +0000 UTC m=+3941.702394473" watchObservedRunningTime="2026-04-16 23:19:22.578896649 +0000 UTC m=+3941.703520933" Apr 16 23:19:23.121492 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:23.121457 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sbt4r_c2ffb347-b866-40c3-b690-eb1a564849ef/dns/0.log" Apr 16 23:19:23.143042 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:23.143017 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sbt4r_c2ffb347-b866-40c3-b690-eb1a564849ef/kube-rbac-proxy/0.log" Apr 16 23:19:23.166661 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:23.166630 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58875_7a33450b-5146-4096-a0eb-79767266b790/dns-node-resolver/0.log" Apr 16 23:19:23.653737 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:23.653703 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t27sn_534a804f-de9c-430e-9e0a-47849b4977da/node-ca/0.log" Apr 16 23:19:24.348927 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:24.348896 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5ccb67d98b-l72zk_91f82d6f-53a5-432d-b3a4-03bdde2f00e5/router/0.log" Apr 16 23:19:24.653252 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:24.653171 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-898dq_3fede498-aa63-44a3-8ef2-e602f7ca7131/serve-healthcheck-canary/0.log" Apr 16 23:19:25.156719 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:25.156690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xwqzs_9b4e0565-78b4-4d03-a01b-ce6b39a81446/kube-rbac-proxy/0.log" Apr 16 23:19:25.175680 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:25.175655 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xwqzs_9b4e0565-78b4-4d03-a01b-ce6b39a81446/exporter/0.log" Apr 16 23:19:25.195641 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:25.195617 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xwqzs_9b4e0565-78b4-4d03-a01b-ce6b39a81446/extractor/0.log" Apr 16 23:19:27.085544 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:27.085509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84d7d5cfc6-2qdg6_375b0231-b7e5-418f-815e-42e8c0618990/manager/0.log" Apr 16 23:19:27.125560 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:27.125532 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-28qb8_518c2439-1b1d-45e9-9af6-e80e3c47182d/server/0.log" Apr 16 23:19:27.455376 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:27.455352 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-wnmw4_521f722d-8bc2-4b4d-849a-9eda46c2bcf4/seaweedfs/0.log" Apr 16 23:19:27.476724 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:27.476694 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-qzgvs_8c535426-ec52-40a4-8031-7068e4ce28d1/seaweedfs-tls-custom/0.log" Apr 16 23:19:27.497987 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:27.497958 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-f57cn_be52c574-4b03-4c0b-816f-4b024c5aa785/seaweedfs-tls-serving/0.log" Apr 16 23:19:28.576242 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:28.576205 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-x6mdh" Apr 16 23:19:31.391222 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:31.391148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bx8dw_2bb3aae4-cca6-473a-963f-eaafa8efc8fa/migrator/0.log" Apr 16 23:19:31.414109 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:31.414084 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bx8dw_2bb3aae4-cca6-473a-963f-eaafa8efc8fa/graceful-termination/0.log" Apr 16 23:19:31.781970 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:31.781934 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-5ddtz_c83c75a9-9ffe-4644-b733-f725231b1b4b/kube-storage-version-migrator-operator/1.log" Apr 16 23:19:31.783185 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:31.783168 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-5ddtz_c83c75a9-9ffe-4644-b733-f725231b1b4b/kube-storage-version-migrator-operator/0.log" Apr 16 23:19:32.937205 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:32.937119 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6fh4_a0f43c4e-755a-43dd-96e1-ee4825dcce6e/kube-multus-additional-cni-plugins/0.log" Apr 16 23:19:32.961517 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:32.961489 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6fh4_a0f43c4e-755a-43dd-96e1-ee4825dcce6e/egress-router-binary-copy/0.log" Apr 16 23:19:33.009708 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:33.009686 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6fh4_a0f43c4e-755a-43dd-96e1-ee4825dcce6e/cni-plugins/0.log" Apr 16 23:19:33.032530 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:33.032502 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6fh4_a0f43c4e-755a-43dd-96e1-ee4825dcce6e/bond-cni-plugin/0.log" Apr 16 23:19:33.053980 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:33.053952 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6fh4_a0f43c4e-755a-43dd-96e1-ee4825dcce6e/routeoverride-cni/0.log" Apr 16 23:19:33.074483 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:33.074456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6fh4_a0f43c4e-755a-43dd-96e1-ee4825dcce6e/whereabouts-cni-bincopy/0.log" Apr 16 23:19:33.094555 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:33.094528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p6fh4_a0f43c4e-755a-43dd-96e1-ee4825dcce6e/whereabouts-cni/0.log" Apr 16 23:19:33.153640 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:33.153615 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxsfw_53a1b81a-7c98-464a-b673-d8b7022f892d/kube-multus/0.log" Apr 16 23:19:33.175033 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:33.174999 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6dklm_c3eaddc5-e6c1-45aa-a952-0c7d74359e05/network-metrics-daemon/0.log" Apr 16 23:19:33.198659 ip-10-0-133-183 kubenswrapper[2576]: 
I0416 23:19:33.198630 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6dklm_c3eaddc5-e6c1-45aa-a952-0c7d74359e05/kube-rbac-proxy/0.log" Apr 16 23:19:34.242734 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:34.242704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l8m6g_55cc55b1-436a-4a28-81cd-ec449dee73fb/ovn-controller/0.log" Apr 16 23:19:34.276376 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:34.276341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l8m6g_55cc55b1-436a-4a28-81cd-ec449dee73fb/ovn-acl-logging/0.log" Apr 16 23:19:34.294802 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:34.294767 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l8m6g_55cc55b1-436a-4a28-81cd-ec449dee73fb/kube-rbac-proxy-node/0.log" Apr 16 23:19:34.314079 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:34.314049 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l8m6g_55cc55b1-436a-4a28-81cd-ec449dee73fb/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 23:19:34.330939 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:34.330907 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l8m6g_55cc55b1-436a-4a28-81cd-ec449dee73fb/northd/0.log" Apr 16 23:19:34.351475 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:34.351443 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l8m6g_55cc55b1-436a-4a28-81cd-ec449dee73fb/nbdb/0.log" Apr 16 23:19:34.370952 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:34.370921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l8m6g_55cc55b1-436a-4a28-81cd-ec449dee73fb/sbdb/0.log" Apr 16 23:19:34.484493 ip-10-0-133-183 kubenswrapper[2576]: I0416 
23:19:34.484461 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l8m6g_55cc55b1-436a-4a28-81cd-ec449dee73fb/ovnkube-controller/0.log" Apr 16 23:19:35.695370 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:35.695319 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-7qnvc_af33378f-5a90-42ce-933e-c366ea1cbfb6/check-endpoints/0.log" Apr 16 23:19:35.762438 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:35.762405 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kd4tx_c5bcd735-429b-49bf-8436-33eb976d199a/network-check-target-container/0.log" Apr 16 23:19:36.601501 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:36.601476 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xbw9g_4337d15c-2043-4ee1-a3d7-09710bf7d026/iptables-alerter/0.log" Apr 16 23:19:37.192367 ip-10-0-133-183 kubenswrapper[2576]: I0416 23:19:37.192338 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-mfrqz_6bcc3e13-0c5e-4d1f-bbf6-d609cae55a74/tuned/0.log"